Uploaded to ResearchGate by Andrzej J. Gapinski (Pennsylvania State University) on 25 September 2017.
It is with pleasure that we present to you the Proceedings of the 2008 International
Conference on Industry, Engineering and Management Systems (IEMS). The
papers presented this year were of consistently high quality in their respective
fields. The papers cover a wide range of topics in the business and engineering
disciplines, integrating concepts that further the mission of the IEMS Conference.
These proceedings would not have been possible without the valuable
contributions of our Track Chairs, who spent considerable time and effort
reviewing all the papers, and of our Administrative Coordinator, Elki Issa,
whose work behind the scenes helps make our Conference a success.
Warm Regards,
TRACK CHAIRS
Accounting/Finance: LuAnn Bean, Florida Institute of Technology
Lean Six Sigma: Sandra Furterer, Univ. of Central Florida
Abstract Concepts in Computer Science: Dia Ali, University of Southern Mississippi
Management Information Systems: John Wang, Montclair State University
Automation/Intelligent Computing: Andrzej Gapinski, Penn State University
Management & Organizational Behavior: Ed Hernandez, Cal State Univ., Stanislaus
Construction Management: Mostafa Khattab, Colorado State University
Management of Technology: Gordon Arbogast, Jacksonville University
Decision Making in Mgmt & Engineering: E. Ertugrul Karsak, Galatasaray University
Marketing: Kaylene Williams, CSU, Stanislaus
Decision Support Systems: Dia Ali, University of Southern Mississippi
Operations Management: J.S. Sutterfield, Florida A&M University
Education and Training: Ralph Janaro, Clarkson University
Project Management: Stephen Allen, Truman State University
Entrepreneurship: Colin Benjamin, Florida A&M University
Quality Management: Hesham Mahgoub, South Dakota State Univ.
Global Applications: Djehane Hosni, University of Central Florida
Simulation and Modeling: Kevin O’Neill, Plattsburgh State University
Human Computer Interaction: Mohammad Khasawneh, SUNY, Binghamton
Stat. Quality Improvement & Control: Gamal Weheba, Wichita State University
Human Engineering: Deborah Carstens, Florida Institute of Tech.
Supply Chain Management: Ken Morrison, Kettering University
Industry and Academia Collaboration: Alexandra Schönning, Univ. of North Florida
Technology Commercialization: Tiki Suarez, Florida A&M University
Lean Enterprise: Isabelina Nahmens, Univ. of Central Florida
Proceedings of the 2008 IEMS Conference
TABLE OF CONTENTS
Stephen Allen 20
A LOOK AT CLOSING THE LOOP IN AN UNDERGRADUATE
PROJECT MANAGEMENT CLASS
Shahram Amiri 25
INFORMATION AND TELECOMMUNICATION TECHNOLOGY: BLUEPRINT FOR
SOCIAL-ECONOMIC GROWTH IN DEVELOPING NATIONS
Gordon Arbogast 31
DEDICATED ACCOUNT TEAMS: WORTH THE COST?
Andrzej Gapinski 80
PARTIAL DISCHARGE (PD): PSPICE SIMULATIONS
Alicia Combs, Mihaela Petrova-Bolli, Eric Tucker and Dana Johnston 147
APPLYING SIX SIGMA TO SERVICE ORGANIZATIONS: A CASE FOR THE
FLORIDA DEPARTMENT OF CHILDREN AND FAMILIES
Figures 2a and 2b illustrate the HTA for a portion of the process, when the
milk transfer is complete and the milk tank is empty.

3. Wireless Security Monitoring System (SMS)

In order to improve efficiency, reduce process errors and improve security, a
wireless security monitoring system (SMS) integrated with GPS and electronic
locks was developed for milk trucks [2]. The system ensures that milk
collected from the farm is subject only to authorized access, and that the
samples collected for further testing are stored at the right temperature and
tested within the requisite time interval. The system records the correct
amount of milk collected; the miles traveled in different states by the
hauling truck; the service hours of the drivers; and the wash times of the
truck. It also enables optimal routing and cycle times of the milk trucks from
collection through unloading at the processing plants [3], [4]. The system
also allows the printing of the required documentation, such as barn tickets,
milk sample labels, wash tickets, etc.

Figure 3a illustrates the SMS and Figure 3b defines the communication links in
the SMS.

4. Assessment of the Improvement in Security with SMS using CARVER + Shock

“CARVER + Shock”, developed by the Food and Drug Administration / Center for
Food Safety and Applied Nutrition (FDA/CFSAN), is used by the United States
Department of Agriculture Food Safety and Inspection Service (USDA FSIS) to
assess the vulnerabilities of a system or infrastructure to an attack [5]. It
allows the user to think like an attacker in order to identify the most
attractive targets for an attack. By contrasting a “CARVER + Shock”
assessment of the current milk collection and transport process with the
proposed process, one can assess the improvement in security with the SMS.

“CARVER + Shock” stands for the six CARVER attributes, plus Shock, used to
evaluate the attractiveness of a target for attack:
• Criticality: helps to assess the critical impact an attack can have on
public health and the economy.
• Accessibility: helps to assess the ability to physically access and egress
from a target.
• Recuperability: helps to assess the ability of a system to recover from an
attack.
• Vulnerability: helps to assess the ability to accomplish a successful
attack.
• Effect: helps to assess the amount of direct loss from an attack as
measured by loss of production.
• Recognizability: helps to assess the ease of identifying a target.
• Shock: helps to assess the combined health, economic and psychological
impacts of an attack within the food industry.

The assessment process using CARVER + Shock is as follows:
1. A process flow chart is constructed using the tool.
2. CARVER + Shock scores are based on answers to specific questions asked for
each process node in the flow chart.
3. The results are presented as a color-coded process flow diagram in
different shades of blue, based on the total CARVER + Shock score for each
process node. Process nodes with low scores, which imply lower risk, appear
light blue, becoming darker as the CARVER + Shock score increases.
Double-clicking on a process node displays the node’s individual assessment,
which provides more detailed information regarding the risk assessment of that
node, including the individual scores for each element that makes up the
CARVER + Shock total. Additional results are displayed as a bar graph;
clicking on each bar brings up text with suggestions for risk mitigation and
improving security at process nodes.
5. Conclusions
References
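To make the scoring scheme above concrete, the following sketch sums the seven attribute scores for a single process node and maps the total onto a shade of blue (lighter meaning lower risk). The scale bounds and the four shade bands are illustrative assumptions, not the FDA/CFSAN tool's actual palette or scoring questions.

```python
# Illustrative CARVER + Shock scoring: total the seven attribute scores
# for one process node and map the total to a shade of blue.
# The low/high bounds and four-band palette are assumptions for
# demonstration only.

ATTRIBUTES = ["criticality", "accessibility", "recuperability",
              "vulnerability", "effect", "recognizability", "shock"]

def total_score(scores):
    """Sum the seven CARVER + Shock attribute scores for one node."""
    return sum(scores[a] for a in ATTRIBUTES)

def shade(total, low=7, high=70):
    """Map a total score onto four shades: light (low risk) to dark."""
    bands = ["light blue", "medium-light blue", "medium blue", "dark blue"]
    width = (high - low) / len(bands)
    return bands[min(len(bands) - 1, int((total - low) // width))]

node = dict(zip(ATTRIBUTES, [1, 4, 2, 1, 3, 10, 1]))  # hypothetical scores
print(total_score(node), shade(total_score(node)))
```

In the actual tool the per-attribute scores come from guided questions answered for each node of the process flow chart; only the aggregation and color-coding step is sketched here.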
Figure 2a – Illustration of HTA for the Process Steps after the Milk Tank is Emptied
Figure 2b – Areas of Possible Process Errors from the HTA of Process Steps in Figure 2a
[Figure 3: Security Monitoring System (SMS) – lock position, GPS data, temperature information, user interface information; data attribute updates]
CARVER + Shock scores by process node (seven attribute scores, total, node):

1 4 2 1 3 10 1   23   Milk
1 1 2 3 3 10 1   22   Seal truck until next pick up (Sealer)
1 1 5 1 3 10 1   22   Control checks all the locks (Other Control Checks)
1 1 4 1 3 10 1   21   Milk* (Tanker Truck Filler)
1 1 4 1 3 10 1   21   * Milk (Receiving)
1 1 4 1 3 10 1   21   Milk hauling truck (Tanker Truck)
1 1 4 1 3 10 1   21   Clean/Sanitize milk storage tank (Clean/Sanitize Equipment)
1 1 4 1 3 10 1   21   Milk* (Storage Tank Refrigerated)
1 1 4 1 3 10 1   21   Seal top and rear lids (Sealer)
1 1 4 1 3 10 1   21   * Milk (Pump)
1 1 4 1 3 10 1   21   Go for next pick up (Transport)
1 1 4 1 3 10 1   21   Control checks all the locks before receiving (Other Control Checks)
1 1 4 1 3 10 1   21   Clean/Sanitize tank after unloading (Clean/Sanitize Equipment)
1 1 2 1 3 10 1   20   Scan processing plant ID (Scanner)
1 1 2 1 3 10 1   20   Processing Plant * [2] (Storage Tank Refrigerated)
1 1 2 1 3 10 1   20   Scan driver, farm and truck ID (Scanner)
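Reading a few rows of the table programmatically, one can recompute each node's total from its attribute columns and rank the nodes from highest to lowest risk. Only a handful of rows are transcribed below, and the column order is assumed to follow the CARVER + Shock attribute order.

```python
# Recompute totals from the seven attribute columns and rank process
# nodes by descending total. Rows are a subset transcribed from the
# table above; column order (C, A, R, V, E, R, S) is an assumption.
rows = [
    ("Seal truck until next pick up (Sealer)", [1, 1, 2, 3, 3, 10, 1]),
    ("Control checks all the locks",           [1, 1, 5, 1, 3, 10, 1]),
    ("Milk (Tanker Truck Filler)",             [1, 1, 4, 1, 3, 10, 1]),
    ("Scan processing plant ID (Scanner)",     [1, 1, 2, 1, 3, 10, 1]),
]

ranked = sorted(rows, key=lambda r: sum(r[1]), reverse=True)
for name, scores in ranked:
    print(sum(scores), name)
```

Such a ranking is essentially what the tool's color-coded diagram conveys visually: the darkest nodes are the ones that would sort to the top of this list.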
routings also lead to increased fuel consumption and cost. Another common
problem that occurs frequently is the extensive delay in unloading the milk at
processing plants, owing to the limited capacity at the plants relative to the
number of trucks waiting to be unloaded. The possibility of milk spoilage
during this wait time, costly truck idle time, and extensions of a milk
hauler’s work day result in unhappy milk haulers and significantly impact
driver retention [2].

2. Objective

This paper demonstrates the use of Geographical Information Systems (GIS)
with the real-time information available via the new system to improve the
logistics of milk collection and delivery. For example, consider a milk truck
traveling interstate after collecting milk from farms in Florida; it is
transporting the milk to a designated processing plant in Kentucky. En route
to Kentucky, real-time information on truck location, capacity, temperature of
its contents, etc. is transmitted to a monitoring system. If, owing to weather
and traffic conditions as the truck approaches, say, Atlanta, the milk
temperature appears to be headed towards the red zone (unacceptable region),
adaptive decisions can be made in real time with the GIS-based system, such as
redirecting the truck to a nearby processing plant with sufficient capacity in
the Atlanta area. In a similar way, routing and scheduling decisions can be
adjusted based on real-time information to meet other objectives, such as
minimizing travel times and wait times. Examples of adaptive scheduling
decisions include defining staggered release schedules of milk trucks for the
collection of milk, based on information on current truck locations, routes
and collection times, to minimize wait time at the unloading stations, and
redirecting a truck to a selected alternate farm if the power is off at a
particular farm.

2.1 Preliminary Analysis

To evaluate and demonstrate impacts, such as the wait time at processing
plants based on dynamic scheduling decisions, a simulation model was developed
with three hypothetical hauling companies of different sizes, servicing
clusters of farms and unloading at two processing stations with different
capacities. The preliminary simulation analysis and runs with an optimization
tool [OPTQUEST] demonstrated that adaptive scheduling can indeed have a
positive impact on reducing the wait time at the processing plant and other
delays [3].

3. Demonstration of Logistics Decisions using GIS

In order to demonstrate, with some credibility, the application of GIS for
logistics-related decisions, real data is used from The Alan Wilson Hauling
Company in Somerset, Kentucky, which collects milk from 388 farms in Kentucky
and processes the milk at The Southern Belle Dairy in Somerset, Kentucky.

3.1 Basic Process of Milk Hauling

Daily schedules are provided to milk haulers which define the cluster of farms
that each milk hauler should visit to collect milk and take to the processor.
This cluster depends on the capacity of the milk truck and the amount of milk
expected from each farm. A typical truck for the above-mentioned hauling
company has a capacity of 50,000 gallons. At each farm the milk is examined
for any visible contamination and its temperature is checked to see if it is
within standards (i.e., < 40 deg. F). If the milk meets the required
standards, samples are collected and stored in a refrigerated container on the
truck, and the milk is loaded into the truck. The volume of milk collected
from the farm is also noted. This whole process at a farm takes about 20
minutes. If the milk truck is not full, it goes to the next farm on its route;
otherwise it goes to the processing station. At the processing station, the
samples are evaluated and the milk truck is checked for any signs of intrusion
(currently this is done by examining the seals; with the new system it would
be through computerized records that determine when and for what reason the
locks providing access to the milk are opened). If the milk samples from any
farm fail the evaluation standards or if the seals are broken, then all the
milk carried by the truck is
rejected. The actual process time for unloading the milk at the processing
plant is around 40 minutes. However, as mentioned before, milk haulers spend
considerable queue time “waiting their turn” at the processing station.
Traffic conditions, inclement weather, and insufficient milk at farms all
cause an increase in throughput time for a milk hauler. This increases the
probability of milk spoilage and unhappy milk haulers.

3.2 GIS Application to Milk Hauling Logistics

With the use of GIS a number of the problems mentioned above can be mitigated.
GIS is capable of providing routes that consider traffic, weather and road
conditions. Also, since all the farm data is in the GIS database, it can
provide information to reroute the milk hauler to another farm in the vicinity
for several reasons, such as: additional capacity available on the truck; the
farm the hauler was scheduled to visit being taken off the list of approved
farms; the scheduled farm being out of power, making milk collection
infeasible; etc. Demonstrations of these capabilities of GIS are provided in
the following sections.

3.3 GIS Setup

… route from a starting location through the seven potential milk farms,
listed in Table 1, before proceeding to the milk processor. Seven farms are
selected for demonstration purposes, since, on average, seven farms are
visited before the milk hauler goes to the processor with a full tank.

1. Producer – Marty Clark. Address: 4311 Knob Lick Wisdom Road, Knob Lick 42154.
2. Producer – Daniel Adams. Address: 1304 Milltown Church Road, Milltown 42761.
3. Producer – Amy Coomer. Address: 111 Long Meadow Drive, Greensburg 42743.
4. Producer – Hubert G. Wright. Address: 840 Elmo Russell Road, Summersville 42782.
5. Producer – Terry Tow bridge. Address: 5720 Hiseville Park Road, Horse Cave 42749.
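The per-farm decision process described in Section 3.1 can be sketched as a simple loop. The 40 deg. F temperature limit and 50,000-gallon truck capacity come from the text; the farm records, field names, and volumes are made up for illustration.

```python
# Minimal sketch of the farm-visit loop from Section 3.1: inspect and
# temperature-check the milk, sample and load it, then continue to the
# next farm or head to the processor once the tank is full.
TEMP_LIMIT_F = 40.0           # milk must be below 40 deg. F (from the text)
TRUCK_CAPACITY_GAL = 50_000   # typical truck capacity (from the text)

def visit_farm(load_gal, farm):
    """Return the updated load, or None if the farm's milk is rejected."""
    if farm["visible_contamination"] or farm["temp_f"] >= TEMP_LIMIT_F:
        return None                       # milk fails the standards
    collect_sample(farm)                  # sample kept refrigerated on the truck
    return load_gal + farm["volume_gal"]  # record volume and load the milk

def collect_sample(farm):
    print(f"sample taken at {farm['name']}")

route = [   # hypothetical cluster of farms on one daily schedule
    {"name": "Farm 1", "visible_contamination": False, "temp_f": 37.5, "volume_gal": 18_000},
    {"name": "Farm 2", "visible_contamination": False, "temp_f": 38.0, "volume_gal": 20_000},
    {"name": "Farm 3", "visible_contamination": False, "temp_f": 36.0, "volume_gal": 15_000},
]

load = 0
for farm in route:
    new_load = visit_farm(load, farm)
    if new_load is not None:
        load = new_load
    if load >= TRUCK_CAPACITY_GAL:  # tank full: go unload at the processor
        break
print("heading to processor with", load, "gallons")
```

The processing-station checks (sample evaluation, seal or lock-record inspection, whole-load rejection) would follow the loop; they are omitted here to keep the sketch focused on the collection leg.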
3.5.2 Time Window Based Routing

Routing can also be based on “time windows”, where trucks are scheduled to
arrive at the different farms within a specified time slot. Hence, with the
GIS tool a schedule and route are established so that milk haulers arrive at
the farms within the allotted time window. This is quite a common requirement,
since haulers maintain close relationships with farm owners and try to satisfy
their scheduling preferences. Table 2 illustrates time-window-based routing
from Farm 1 to Farm 2, where Farm 1 is given a time window of 8:00 a.m. to
8:20 a.m. and Farm 2 a time window of 8:50 a.m. to 9:10 a.m.

3.4.1.3 Routing to the Nearest Facility

The Closest Facility tool in Network Analyst can be used to find the route to
the nearest facility. This is necessary if the truck needs to go to the
nearest allocated farm from another farm or from any specified location. This
need could arise if the hauler needs to visit another farm to fill his tank,
or if, en route to a farm, a hauler receives word that a designated farm is
blacklisted or removed from the list of farms from which milk is allowed to be
collected. The Closest Facility problem in Network Analyst involves finding
the closest facility to a given location. The given location is referred to as
the “incident” and the milk farms are referred to as “facilities”. Network
Analyst will identify the closest facility and also specify the best route.
Figure 4 illustrates such a route.

… also defines the order in which the farms should be visited. The components
of this layer include the following:
1. Origins feature layer: stores the network locations that are used as
potential origins or starting points.
2. Destinations feature layer: stores the network locations that are used as
destinations.
3. Barriers feature layer: used to specify a point through which a route is
not possible.
4. Lines feature layer: stores the results and the attribute table of the OD
cost matrix.

Figure 5 illustrates an application of the O.D. cost matrix layer to establish
a depot.

References
1. “Emission mandates”, Bulk Transporter, September 2006, p. 22.
2. “Financial Plan – Market implications impact transporters”, Bulk
Transporter, September 2006, pp. 24-25.
3. Janjirala, P., Sivaguranathan, S., Alexander, S.M. and Evans, G.W.,
“Improving the logistics of milk collection via simulation”, Proceedings,
Industry, Engineering and Management Systems Conference, March 12-14, 2007,
pp. 602-605.
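Conceptually, the Closest Facility computation pairs an "incident" location with the nearest "facility". The sketch below is a stand-in for that idea using straight-line distance and made-up planar coordinates; Network Analyst itself uses road-network travel cost, which this does not reproduce.

```python
import math

# Stand-in for the Closest Facility idea: given an "incident" (the
# truck's location) and candidate "facilities" (farms), pick the nearest
# one. Straight-line distance and the coordinates below are purely
# illustrative; the real tool routes over the road network.

def closest_facility(incident, facilities):
    """Return the (name, location) of the facility nearest the incident."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return min(facilities.items(), key=lambda f: dist(incident, f[1]))

farms = {                 # hypothetical planar coordinates (miles)
    "Farm A": (2.0, 3.0),
    "Farm B": (10.0, 1.0),
    "Farm C": (4.0, 8.0),
}
truck = (3.0, 2.5)        # current truck location (the "incident")
name, loc = closest_facility(truck, farms)
print("nearest farm:", name)
```

Swapping the distance function for a network shortest-path cost (and returning the path itself) is what separates this toy from the actual Closest Facility analysis.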
Figure 1 – Kentucky Farms Covered by the Alan Wilson Hauling Company
Figure 5 – The use of the O.D. cost matrix layer to establish a depot location
4. Conclusions

An undergraduate project management business course could be taught and
assessed for student achievement solely in knowledge areas. That same course
could benefit from an additional design element which applies the acquired
knowledge areas. The additional design element, assessed by the instructor,
would add incremental improvement to student achievement of the project
management knowledge areas. Extending the design element by having faculty
serve in the role of consultant to a student project designer, through which
the knowledge areas are applied, further improves the understanding of the
knowledge areas. By having student-designed projects assessed by external
professional project management practitioners, the students get practitioner
feedback on their design and analysis, thereby closing the loop of learning.

5. References

[1] F.D.S. Choi, “Accounting Education for the 21st Century”, Issues in
Accounting Education, 8, 1993, pp. 423-430.
[2] R. Cornesky, Using Deming to Improve Quality in Colleges and Universities,
Madison, WI: Magna Publications, 1992.
[3] R. Cornesky, Implementing Total Quality Management in Higher Education,
Madison, WI: Magna Publications, 1991.
[4] D.L. Hubbard, “Can America Learn from Factories?”, Quality Progress,
25(5), 1994, pp. 93-97.
[5] D.T. Seymour, “TQM on Campus: What the Pioneers Are Finding”, AAHE
Bulletin, Washington, D.C., 1991.
Developing Nations – The Need for ICT Infrastructure

How can catching up affect the underdeveloped countries? For some of the
underdeveloped countries, digital poverty is not a result of “naturally
lagging behind”, but rather a direct result of the success of the strong
industrial nations. We should all agree that the discourse on the digital
divide has become a debate of contrasting opinions, but the reality is that
governments and the private sector in the underdeveloped countries have to
play a decisive role in reducing the digital gap
(http://www.acm.org/ubiquity/question/divide.html).

The World Employment Report 2001 emphasizes that literacy and education cannot
be by-passed; both are vital for reaping the greatest advantages from the
emerging digital era. One of the greatest challenges facing underdeveloped
countries today is the promotion and implementation of educational programs
that include literacy, and particularly digital literacy. The digital divide
must therefore be fought on at least two battlefields: economy and education.
With universal access and an Internet-literate workforce, the digital
revolution can serve as the engine of economic growth and development.

ICT and Socio-Economic Development

Any combination of the aforementioned ICT would certainly benefit the peoples
of the developing nations. There is no magic bullet, no easy answer. There
exists only one certainty: if the digital divide is not addressed and bridged
through Internet connectivity, and if digital …

E-Health

E-health is an emerging field at the intersection of medical informatics,
public health, and business, referring to health services and information
delivered or enhanced through the Internet and related technologies. In a
broader sense, the term characterizes not only a technical development, but
also a state of mind, a way of thinking, an attitude, and a commitment to
networked, global thinking to improve health care locally, regionally, and
worldwide by using information and communication technology (Eysenbach, 2001).

Telemedicine may be as simple as two health professionals discussing a case
over the telephone, or as complex as using satellite technology and
video-conferencing equipment to conduct a real-time consultation between
medical specialists in two different countries. Telemedicine generally refers
to the use of communications and information technologies for the delivery of
clinical care.

Clinical uses of e-health technologies include:
• Transmission of medical images for diagnosis
• Groups or individuals exchanging health services or education live via
videoconference
• Transmission of medical data for diagnosis or disease management
• Advice on prevention of diseases and promotion of good health through
patient monitoring and follow-up
• Health advice by telephone in emergent cases

By helping build the necessary infrastructure, such as satellites, and
procuring computers, developing countries would gain access to all types of
information, including health and
education on health. According to former UN Secretary-General Kofi Annan,
“Too many of the world’s people remain untouched by this revolution”
(Mutume, 2007).

Healthcare in underdeveloped countries, mostly in Africa, continues to be a
big challenge for the World Health Organization (WHO). In addition to so many
diseases, a shortage of doctors and medical resources has only worsened the
healthcare problems in underdeveloped countries. For example, Rwanda has a
population of 10 million people and only 500 doctors in total to provide
medical care for everyone in the country. Clearly, with a ratio of one doctor
per 20,000 people, most Rwandan citizens will not get the opportunity to
consult a doctor in their lifetime. According to the WHO report in 2006, the
Americas, which account for just 10% of the global disease burden, have 37% of
the world’s healthcare workers and spend more than 50% of the world’s health
budget (World Health Organization, 2007).

By contrast, Africa has 24% of the disease burden but just 3% of the health
care workforce and accounts for less than 1% of the health spending (Kumar,
2007). Consequently, many people living in underdeveloped countries are dying
of diseases that could have been easily prevented and cured if there were
adequate doctors available to treat them. To make matters worse, most of the
doctors in underdeveloped countries leave their homeland to work in an
industrialized country in search of a higher salary and political stability.
Manuel Dayrit, director of the Department of Human Resources for Health at the
World Health Organization, said, “Even if you have the medicine, the vaccines,
and the bed nets, you need health workers to deliver the service” (Kumar,
2007). Besides the emigration of trained workers, underproduction and internal
maldistribution in the medical field are the other factors that contribute to
the shortage of health care workers in Africa. Most of the health care workers
who stay in their homeland do so only because they have a passion for saving
lives and healing their fellow citizens. Not only do these physicians have to
deal with political instability in their country and low pay, they also have
to work with extremely limited resources. Workloads have soared as nursing
staffs have left for better-paid jobs in the private sector or abroad, because
the work is exhausting and the pay is low.

With regard to the benefits of e-medicine, the ability to head off pandemics
resides at one extreme, while simple acts like allowing a child to survive a
treatable illness or improving life for an asthmatic sit at the other; no
matter how an individual from the developed world feels about this matter,
they stand only to benefit from an improvement in global medicine. Putting
humanity aside for a moment, e-medicine could accomplish what all the king’s
horses and all the king’s men (i.e., the UN, Red Cross, etc.) could not: reach
any person, anywhere on the globe, and provide medical diagnosis and treatment
recommendations, at the least. If cellular phones become universal, as hoped,
then the cameras found on most models could greatly aid in diagnosis, for
example.

E-Governance

Medicine is not the only aspect of life that will be improved as digital
inclusion spreads. Governance will rise to the top. As soon as people actually
have a voice, or a means of communicating over distance, they will begin to
speak for themselves. E-governance, as it is called, allows individuals from
rural, isolated areas to have a say in government. “E-governance has to be
viewed as a political process dealing with reform of governance towards good
governance. And governance reform is a slow process requiring engaging with
institutions & bringing about attitudinal and constitutional changes,”
according to Vikas Nath (Nath, 2007).

E-Democracy, the end goal of the WSIS and a form of e-governance, refers to
the use of these mechanisms in the democratic process. This usage varies
greatly in form and content (van der Graft and Svensson, 2006). E-democracy
can be as simple as electronic access by citizens to government information
and services, or more complex, involving interaction between the citizens and
the government (Norris, 2007).

Some components of e-governance are news, events, and political
participation. As more and more people become aware of, and included in,
the world’s events, they will certainly express their opinions and attempt to
better things. Imagine: if the people of Darfur were not separated by the
digital divide, they would most likely generate more global outcry over their
predicament. Digital literacy, along with the necessary ICT infrastructure,
grants individuals and groups the ability to participate effectively in their
own governance.

E-Learning

Another improvement made possible by ICT infrastructure is electronic
learning. “E-learning has been defined as the use of new multimedia
technologies and the Internet to improve the quality of learning, to make it
accessible to people out of reach of good educational facilities, and to make
new and innovative forms of education available to all,” according to the
WSIS (World Summit on the Information Society, 2007). Relevant content is
perhaps the most important aspect of e-learning for the foreseeable future.
Surely, most people agree that knowledge should be made available to all of
the world. However, information which seems trivial to modern societies can
make massive impacts upon developing nations.

Agricultural and scientific knowledge is capable of revolutionizing the Third
World. If knowledge such as meteorological data, farming techniques,
livestock and crop husbandry, and common Western medical knowledge, for
example, were openly available to the world’s poor, then their quality of
life would greatly improve. Starvation would be reduced, simple illnesses
could be avoided or mitigated, and less conflict over food and other
essentials would arise as a result of the dissemination of knowledge long
considered common by developed nations.

Content is perhaps the most underappreciated facet of e-learning. It is
absolutely correct that as much information should be made available as
possible. However, for peoples who are trying to catch up to, and join, the
modern world, smaller steps may be required. Rather than teaching rural
African children about the American Civil War or other distant subjects, these
same children should be taught African geography and farming techniques
tailored to their region. Local content will need to be created for different
communities, regions, and peoples.

Distance learning is another example of how e-education can elevate the
world’s information poor. Naturally, issues such as literacy and language
barriers, in addition to the actual digital divide, will need to be addressed
and remedied before this dream becomes a reality. Having said that, distance
learning offers the possibility of a structured, Western-style education to
the globe’s poorest.

Conversing with other peoples and learning about distant places and cultures
is another part of e-education. Education sows the seeds of tolerance.
Imagine a world where the poor children of Pakistan are no longer limited to
madrasas run by religious teachers; if these impressionable young minds could
be taught tolerance and science, rather than religious dogma, the dividends
would be incalculable. This grand dream is within humanity’s reach. Wars and
genocides, among other scourges, will become less common as different races
and peoples learn about and from, and converse with, each other.
Hunter-gatherers in South America know nothing of the situation in Darfur,
but through connectivity these two isolated peoples could communicate and
relate to one another. So much of the world’s strife is a result of ignorance
regarding others. Digital inclusion tears down barriers and allows us to see
ourselves in others, rather than someone who looks, speaks, worships, or lives
differently.

E-Commerce

Finally, the last major facet of life which digital literacy and inclusion
stand to improve for all is commerce. Some individuals view commerce and
profit with scolding eyes. However, trade and the desire for wealth have
driven humanity to its current pinnacle, and will take it even higher. By
connecting the remaining 90%+ of the worldwide population, business gains a
much larger market. One day in the future, the entire world will constitute
one market: an actual “global village.”

The United Nations has also addressed the topic of e-business. According to
its Conference on Trade and Development (UNCTAD), business-to-consumer (B2C)
e-commerce is still only a small percentage of all electronic commerce;
10) Livingston, Stephan. “Diplomacy in the New Information Environment (2003,
Feb 23).” Georgetown Journal of International Affairs 4, no. 2: 111.

11) Marshall, J.M. “Learning with Technology: Evidence that Technology Can,
and Does, Support Learning.” May 2002. San Diego, CA: San Diego State
University.

12) Matteo, Sonina. “Philly Pushes for Low-Cost Wi-Fi for Its Poorest
Residents,” Network World 24: 30, 6 August 2007, 40; and “Houston Mayor White
Names New Digital Inclusion Project Director,” U.S. Fed News Service, 12
March 2007.

13) Mutume, Gumisai. “Africa Takes on the Digital Divide.” Cited 2 December
2007. Available from
http://www.un.org/ecosocdev/geninfo/afrec/vol17no3/173tech.htm; INTERNET.

14) Naisbitt, John. “The New Media Age: End of the Written Word?” The
Futurist 41, no. 2 (March/April 2007).

15) Nath, Vikas. In DigitalGovernance.Org Initiative [website] (London, U.K.
[cited 10 November 2007]). Available from
http://216.197.119.113/artman/publish/index1.shtml; INTERNET.

16) Norris, Donald F. “E-Government and E-Democracy at the American
Grassroots.” International Conference on Public Participation and Information
Technologies. [Cited October 25, 2007]. Available from
http://www.umbc.edu/mipar/documents/-EGovernmentandEDemocracy.pdf

17) OECD. Digital Divide Report 2000. Retrieved October 30, 2007. Available
from http://www.oecd.org/dataoecd/56/14/39204469.doc.

18) OECD. Understanding the Digital Divide (2001). Cited 30 October 2007.
Available from http://www.oecd.org/dataoecd/38/57/1888451.pdf; INTERNET.

19) Oguya, Vivienne. “Bridging Africa’s Digital Divide: A Case for CD-ROM
Technology,” Quarterly Bulletin of the International Association of
Agricultural Information Specialists 51, no. 3 (2006): 161.

20) “Race is on to lay undersea fiber optic cable on eastern Africa coast,”
Technology Review, 1 June 2007 [electronic article] [cited 10 November 2007];
available from http://www.technologyreview.com/Wire/18814/; INTERNET.

21) Ribble, Mike & Gerald Bailey. “Digital Literacy.” Cited 3 December 2007.
Available from http://www.educ.ksu.edu/digitalcitizenship/Education.htm;
INTERNET.

22) Rowe, Julie. International Computer Driving License (ICDL): International
Digital Literacy Takes Hold. January 2004; available from
http://www.certmag.com; INTERNET.

23) Trippe, Bill. “Content Management Technology: A Booming Market.” Econtent
(2001, Feb-Mar.), 20-21.

24) United Nations Conference on Trade and Development. E-Commerce and
Development Report 2004. New York, 2004: 2: 4.

25) van der Graft, Paul, and Jorgen Svensson. “Explaining eDemocracy
development: A quantitative empirical study.” Information Polity 11 (2006):
123-134.

Wade Roush and David Talbot, “Beaming Books,” Technology Review 109, no. 2
(2006).
• Creating easy-to-do business with delivery systems
• Training and supporting employees
• Involving and empowering employees
• Recognizing and rewarding good performance and celebrating success
• Setting the tone and leading the way through personal example” (Connellan, Zemke, p. 6).

Delivering high quality customer service must be a part of everyday performance. Customer surveys are a critical part of understanding customer needs. Successful organizations have one common central customer focus. Once good performance is delivered consistently, you must improve; ongoing improvement is equally important. The Plus One Percent Rule is vital to a company’s ongoing concerns. “The Plus One Percent Rule is to keep you moving ahead and focused beyond your vision” (Blanchard and Bowles, p. 114). All you need to do is improve 1% each week; if you can do that, you are ahead 52% in a year’s time. “You can make big changes in almost anything or achieve great things in your life by improving or changing 1%. Things can’t help but improve if you keep it at 1% each time” (Blanchard and Bowles, p. 117). Progressive improvement is necessary at any level of achievement. It guarantees positive change.

2. Background

“The process of reorienting a large organization toward its market is analogous to trying to teach an elephant to dance. Many of the same challenges are involved” (Albrecht and Zemke, p. 169). Organizations that have achieved excellence in service share the following characteristics:
• They possess a strong vision and strategy for service.
• They practice visible management.
• They “talk” service routinely.
• They have customer-friendly service systems.
• They balance high tech with high touch, i.e., they temper their systems with the personal factor.
• They recruit, hire, train and promote for service.
• They market service to their customers.
• They market service internally to their employees.
• They measure service internally and externally and make those results available.

Driving customer loyalty is not about coming up with new marketing messages or jumping through hoops when things go wrong; it is about creating positive experiences for customers through an endless string of needs understood and delivered on. Turning customers into loyal advocates is essential for business survival.

Customer surveys are essential in helping us target what our customers value most and in aligning our people and processes to consistently deliver on those values. “Customer surveys should always answer the questions of who, what, where, why, when and how. With these answers we can better understand our customer’s needs” (Willingham, p. 49). The following are some key areas we focused on to provide good customer service.

• Communication – Do you make it easy for the customer to communicate? Is the phone answered promptly, and are inquiries handled properly? Do people know where to call? Is the customer satisfaction survey used to confirm that the staff is perceived by the customers as being helpful, courteous and knowledgeable?
• Making it Easy and Pleasant – Can customers find what they need when they need it, and is there sufficient information and help on hand to explain how the services/software work? Are we listening to our customers’ concerns? Are we knowledgeable about their business?
• The Right Quality Products for the Value of Money – Not only should the quality of the service that you provide be measured, but you should also check that the products and services that you market are what the customer wants and closely match their expectations. Do your customers equate your business with securing value for their money?
• Speed and Attention – Customers want to be dealt with quickly, but attentively. Are we doing everything we can to avoid delays? Good businesses try to deal with each customer individually. A quick resolution is necessary for every inquiry.

The Customer Value Management (CVM) approach is a starting point for any analysis; it is an outside-in customer vision of the firm as their ideal provider. “Becoming customer centered with CVM starts by identifying the customers and securing their visions of ideal outcomes from doing business with the company. This includes:
• How the customer perceives value or benefit from interactions with company products.
• What is the optimum level of value that the customer can envision?
• What are the attributes of an ideal vendor and of an ideal value delivery in order to influence buyer behavior, loyalty and growth” (Thompson, p. 115).

“The 7 Principles of Bad Customer Service,” on the other hand, state:
• They’re Grateful for the Chance To Vent – Customers are always grateful for the opportunity to tell how they were mistreated.
• Tomorrow’s Joke – Many people joke to vent their frustration about their last bad customer service experience and tell these “jokes” to anyone who will listen!
• The Memory of an Elephant – Customers often don’t forget. Lots of people quote the time elapsed since the unfortunate incident.
• You’re Not Going to Believe This – Those abused by poor customer service just can never seem to accept the fact that it happened. They remain shocked and continue to agonize.
• No Return, No Deposit – Rarely does a complaining customer indicate that he or she would return to an offending store.
• Free Advertising – The Kind You Don’t Want – Customers will tell their family members, friends and coworkers about a bad customer service incident.
• Hell Hath No Fury Like a Customer Scorned – All of the principles of bad customer service could be summed up in this simple phrase” (Friedman, p. 148).

“The battle to attract and retain loyal customers, given today’s technology, will be fought and won based on the ability to identify potentially profitable current and future market segments and then meet their specific needs” (Thompson, p. 52). “Most firms categorize their customers into groups based on the types and volume of products they use, not on their margin of profit or on the basis of common needs and wants” (Thompson, p. 53). Businesses must ensure they mine and analyze their data to better understand customers at an individual level, group them into profit- and needs-based micro-segments, and access the information for personalized service delivery.

The ideal customer service survey will perform the following three functions:
• Market Research – provides valuable feedback to help improve customer satisfaction levels.
• Marketing – promotes aspects of your business.
• Advertises a service that you provide that your customers may not have been aware of.
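The arithmetic behind the Plus One Percent Rule quoted earlier is worth a quick check: the 52% figure counts the weekly gains additively, while compounding the same 1% week over week gives roughly 68%. A minimal sketch:

```python
# The Plus One Percent Rule: improve 1% each week for a year.
weeks = 52

# Additive reading (the book's 52% figure): each week adds 1% of the baseline.
additive_gain = weeks * 1.0  # percent

# Compound reading: each week improves on the previous week's level.
compound_gain = ((1.01 ** weeks) - 1) * 100  # percent, roughly 67.8
```

Either way, the direction of the argument holds: small, steady improvements accumulate into a large annual change.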
10. Recommendation
Other tests and surveys need to be performed in order to gain better insight into the phenomena surrounding the value of dedicated account teams. This study presents an initial indication that dedicated account teams may have value for mega-sized clients, but further empirical testing and data are needed to achieve more confidence that this is in fact the case.
References
[1] Albrecht, K., Zemke, R., Service America, Dow Jones-Irwin, New York, 1992.
[2] Blanchard, K., Bowles, S., Raving Fans, William Morrow and Co., New York, 1993.
[3] Brackett, T., Peterson, T., Britt, B., Ignelzi, R., and Fitzgerald, K., Customer Service: Dedicated Account Teams, student research paper at Jacksonville University, Spring 2006.
[4] Connellan, T., Zemke, R., Sustaining Knock Your Socks Off Service, American Management Association, New York, 1993.
[5] Friedman, Nancy, Customer Service Nightmares, Crisp Publications, Menlo Park, California, 2003.
[6] Gronstedt, Anders, The Customer Century, Routledge, New York, 2000.
[7] Harvard Business Review on Customer Relationship Management, Harvard Business School Press, Boston, 2002.
[8] Lind, D., Marchal, W., Wathen, S., Basic Statistics for Business and Economics, McGraw Hill, New York, 2003.
[9] National Press Publications, Exceptional Customer Service, Shawnee Mission, KS, 2000.
[10] Rosenbluth, H., McFerrin, D., The Customer Comes Second, William Morrow and Company, New York, 1992.
[11] Thompson, Harvey, The Customer Centered Enterprise, McGraw Hill, New York, 2000.
[12] Willingham, R., Hey, I’m The Customer, Prentice Hall, Paramus, NJ, 1992.
• Conducting system-wide reviews
• A pre-assessment audit

Organizations seeking certification may not use all of these activities and may employ these activities to varying degrees.

Data Collection and Demographics

The data for this study was collected over a twelve-week period using a survey form on the World Wide Web. Survey invitations were e-mailed to members of the Benchmarking Exchange and placed in two consecutive issues of the Quality Digest electronic newsletter.

The number of responses totaled 153, which is relatively small compared to the number of people who received the e-mailed invitations and compared to the 150,000 organizations that are ISO 9001 registered. Based on the number of respondents, this allows
53%, 10 yrs or more – 47%) and internal (52%) vs. external (48%) reasons for seeking certification. The data should not be biased by an overrepresentation of respondents in any of these categories. In addition, the data also represents a mix of working-level (65%) and senior-manager (35%) respondents as well as quality system personnel (55%) vs. other respondents (45%). In the sample, 58% were from manufacturing industries versus 42% for all others. Finally, the survey represents 70% respondents from the United States and 30%

large degree”. The adjective descriptors were converted to numerical values and the average ratings were evaluated.

Figure 1 shows the ratings of the implementation activities sorted from highest usage to lowest usage. Examination of the data concerning the ISO 9000 implementation activities indicated that two activities, the use of external consultants and conducting a pre-assessment audit, were the implementation activities excluded most often. The use of external consultants had the largest variation in rating from respondents.

Figure 1: Use of ISO 9000 implementation activities (bar chart; response scale: Very Large Degree, Large Degree, Medium Degree, Small Degree, Very Small Degree, Not Done)

A statistical test (paired t-test) was used to determine if there was truly a difference in the ranking of the implementation activities. Table 1 shows the differences between the average ratings of the implementation activities. The last row of the table contains the differences between the average rating for ‘use of external consultants’ and all of the other implementation

Figure: Comparison of newness of certification (average rating of each implementation activity, grouped by 3 yrs or less, 4-9 yrs, and 10 yrs or more since certification; activities include Management Commitment, Preassessment Audit, Documentation System, Useful Procedures, External Consultants, System Reviews, Awareness Campaign, Quality System)
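The paired t-test described above can be sketched in a few lines. The rating vectors below are hypothetical stand-ins on the survey's 1-6 usage scale (the study's raw responses are not reproduced here):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic: mean of the per-respondent differences divided by
    its standard error. Pairing works because each respondent rated both
    implementation activities."""
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

# Hypothetical ratings (1 = Not Done ... 6 = Very Large Degree) from seven
# respondents for two of the activities compared in Table 1.
external_consultants = [2, 1, 6, 3, 1, 2, 5]
preassessment_audit = [4, 3, 5, 4, 2, 3, 5]
t_stat = paired_t(external_consultants, preassessment_audit)
```

A negative t here would reflect a lower average rating for external consultants; the corresponding p-value comes from the t distribution with n-1 degrees of freedom.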
Table 1: Pairwise differences between the average ratings of the implementation activities (paired t-test results). Rows and columns cover System Manual, Useful Procedures, Mgt. Commitment, System Reviews, Preassessment Audit, Awareness Campaign, Documentation System, System Training, and External Consultants. Recoverable cells: Useful Procedures vs. System Manual, difference 0.521 (p = 0.000); Mgt. Commitment vs. System Manual, 0.383 (p = 0.000), and vs. Useful Procedures, 0.284 (p = 0.002). (Other cell values illegible in source.)
Organizations that are now following in their footsteps are taking those lessons to heart. Even small businesses can benefit from this same advice.

References

[1] Briscoe, J. A., Fawcett, S. E. and Todd, R. H. (2005) The Implementation and Impact of ISO 9000 among Small Manufacturing Enterprises, Journal of Small Business Management, 43(3), pp. 309-330.

[2] Callaghan, N., and Schnoll, L. (1997) Quality Digest, http://www.qualitydigest.com/aug97/html/cover.html

[3] Haversjo, T. (2000) The financial effects of ISO 9000 registration for Danish companies, Managerial Auditing Journal, 15/1/2, pp. 47-52.

small- and medium-sized enterprises in Singapore, International Journal of Quality & Reliability Management, Vol. 15 No. 5, 1998, pp. 489-508.

[10] Singels, J., Ruel, G., and van de Water, H. (2001) ISO 9000 series certification and performance, International Journal of Quality & Reliability Management, Vol. 18 No. 1, pp. 62-75.

[11] Smith, B. (1994) A Proven Path to ISO 9000 Registration, Industrial Distribution, Jul 1994, 83(7), p. 50.

[12] Terlaak, A. (2002) Exploring the adoption process and performance consequences of industry management standards: The case of ISO 9000, University of California, Santa Barbara.
CJMTK serves as the basis for the decision making capabilities being developed for hydrographic surveys.

4. AutoSurvey™

The ASP [3] is a mission planning system for hydrographic surveys that provides the capability to estimate the most efficient set of track lines for a given area. It is built on top of a library called AutoSurvey™, which provides a set of environmentally adaptive line running methods that take information from a current track line and then predict what the track of the next line should be.

AutoSurvey™ provides three methods of line manipulation: the Adaptive Parallel (AP) method, the Linear Regression (LR) method, and the Piecewise Linear (PL) method. Fig. 2 shows an example of each method.

Figure 2 – (From left to right) The AP method, the LR method, and the PL method. These are estimated track lines for each method covering the same area.

The AP method is a simple method that returns a series of straight, parallel lines whose spacing is based on the minimum depth along the previous track.

The LR method also returns a series of straight lines, but the constraint of being parallel no longer exists. The spacing and azimuth of the line are determined by the best fit to the previous track’s swath width.

The PL method returns a set of lines that is not necessarily parallel or straight. The track line is segmented to get the absolute best fit to the edge of the previous track’s swath. In most cases, this method returns the most efficient estimates.

The ASP alone is a useful decision aid. One of the key advantages of the ASP is that it provides these multiple approaches to running lines. By comparing the results of each method for a given area, the most efficient line method can be proposed for that area. Furthermore, if the mission contains survey plans for several areas, the results of each line method for each area can be compared, and a collection of appropriate methods can be obtained to provide the most efficient approach for the entire mission. Table 1 shows an example report to aid in the determination of this collection of methods.

                     Area 1    Area 2    Area 3
Adaptive Parallel
  Time Inside        218.61    208.88    561.66
  Time Outside        11.66      8.33      6.38
  Total Time         230.27    217.22    568.05
Linear Regression
  Time Inside        215.00    203.05    717.77
  Time Outside        11.94      8.05     17.70
  Total Time         226.94    211.11    735.55
Piecewise Linear
  Time Inside        197.29    195.93    351.18
  Time Outside        19.77     19.84     31.06
  Total Time         217.06    215.77    382.25
Most Efficient         PL        LR        PL

Table 1 – This table shows survey estimations for three areas, and for each line method in that area. The time is measured in hours, the distance in kilometers.

Typically, only bathymetry data, the areas to be surveyed, and predefined sensor parameters are used by the ASP. Additional data and parameters could further enhance the capability provided to mission planners. Furthermore, the ASP itself does not contain a true GIS component. The system is fed a series of files that define the area and the bathymetry, with coordinates in latitude and longitude. These coordinates are converted into Universal Transverse Mercator (UTM) and then converted back after being run through the ASP. By coupling this system with the CJMTK, it is possible to build a system that not only provides the adaptive line methods of the ASP, but also
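The per-area method selection that Table 1 supports reduces to an arg-min over total time. A sketch using the table's totals (hours):

```python
# Total estimated survey time per area and line method, in hours (Table 1).
totals = {
    "Area 1": {"AP": 230.27, "LR": 226.94, "PL": 217.06},
    "Area 2": {"AP": 217.22, "LR": 211.11, "PL": 215.77},
    "Area 3": {"AP": 568.05, "LR": 735.55, "PL": 382.25},
}

# Pick the method with the smallest total time for each area independently.
best = {area: min(times, key=times.get) for area, times in totals.items()}
```

This reproduces the table's "Most Efficient" row: PL for Areas 1 and 3, LR for Area 2.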
allows geospatially coordinated environmental data to be used in the hydrographic mission planning effort.

5. The Hydrographic Mission Planner

The Hydrographic Mission Planner (HMP) is being developed to combine the capabilities of the ASP and the CJMTK to better support the decision making efforts of hydrographic survey planners. The HMP is designed to be an extension to the Mission Planning Mission Monitoring (MPMM) system, which is also being developed by the Naval Research Lab [4, 5]; however, the initial prototype is utilizing the ArcGIS Desktop as authorized by the CJMTK license agreement. Figure 3 shows the current state of the HMP prototype.

display, selecting and clearing features, and measurement functions. All of these tools are standard ESRI components.

The environment toolbar is a collection of custom tools that deal with adding and manipulating environmental data within a project. It allows a user to initialize the geospatial reference of the display, set the area of interest for the mission, and access a variety of data including bathymetry, currents, and Digital Nautical Charts (DNC). Also, the tools that allow the creation and modification of survey areas and track lines using the non-ASP approaches are located on this toolbar.

The AutoSurvey™ toolbar provides a way to interact with the ASP from within the GIS. It includes the ability to start/kill the Cygwin process; Cygwin is a Linux-like runtime environment required on any machine running AutoSurvey™. Also available is a Bash interpretive shell for interaction with the Cygwin environment.

The remainder of the commands on this toolbar provides access to standard ASP dialogs. These dialogs provide the ability to create, edit, simulate, and generate results for the adaptive line running methods.
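The lat/lon-to-UTM round trip the ASP pipeline performs (Section 4) can be illustrated with spherical transverse Mercator formulas. This is only a sketch: real UTM conversions use the WGS84 ellipsoid and zone bookkeeping that the snippet omits, and the function names are hypothetical.

```python
from math import asin, atan2, atanh, cos, cosh, degrees, radians, sin, sinh, tan

R = 6371000.0        # mean Earth radius; real UTM uses the WGS84 ellipsoid
K0 = 0.9996          # UTM scale factor on the central meridian
FALSE_EASTING = 500000.0

def to_tm(lat, lon, lon0):
    """Forward spherical transverse Mercator projection (Snyder's formulas)."""
    phi, dlon = radians(lat), radians(lon - lon0)
    x = FALSE_EASTING + K0 * R * atanh(cos(phi) * sin(dlon))
    y = K0 * R * atan2(tan(phi), cos(dlon))
    return x, y

def from_tm(x, y, lon0):
    """Inverse: recover latitude/longitude from projected coordinates."""
    xn = (x - FALSE_EASTING) / (K0 * R)
    d = y / (K0 * R)
    lat = degrees(asin(sin(d) / cosh(xn)))
    lon = lon0 + degrees(atan2(sinh(xn), cos(d)))
    return lat, lon

# Round trip: project, then invert, as the ASP pipeline does around planning.
x, y = to_tm(45.0, 11.0, lon0=9.0)   # zone with central meridian 9 deg E
lat, lon = from_tm(x, y, lon0=9.0)
```

The forward and inverse pair are exact on the sphere, so the round trip recovers the input coordinates to within floating-point error.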
the TLA tools are part of the MPMM project; however, they are very important to hydrographic survey planning. Because of this, the tools will be implemented as part of the HMP prototype.

For example, if a survey is to be conducted only in water deeper than 200 meters, then a TLA on the bathymetry would provide a new polygon depicting the unacceptable areas for surveying. Figure 6 illustrates such a scenario.

Figure 6 – This figure illustrates a TLA on an area with a depth less than 200m. This is helpful for defining valid operation areas.

7. Conclusions

8. Acknowledgments

public release, distribution is unlimited. NRL contribution number NRL/PP/7440-08-1003.

9. References

[1] International Hydrographic Organization, “IHO Standards for Hydrographic Surveys, 4th Ed.”, Special Publication 44, April 1998.
[2] http://www.cjmtk.com
[3] D. Brandon, B. Bourgeois, “AutoSurvey: Efficiency and Autonomy in Surveying”, Hydro International, vol. 7, no. 10, December 2003.
[4] B. Bourgeois, “Mission Planning Software Support for Underwater Vehicles”, Industry, Engineering, & Management Systems 2008, March 2008.
[5] M. Roe, B. Bourgeois, A. Morris, “Unmanned Underwater Vehicle Mission Planning and Mission Monitoring Prototype”, Industry, Engineering, & Management Systems 2008, March 2008.
Has supply chain management lived up to its predicted promises? The answer is probably no, but considerable progress has occurred and

Very little information was obtained from customers or suppliers. In fact, prior to 1980 there was very little momentum at attempting to
determine the needs of customers or providing demand expectations to suppliers. The organization was expected to use forecasting algorithms to predict sales while suppliers to the organization were expected to anticipate needs.

Why did such an environment exist? Four factors were the major contributors to this situation.

1. Functional excellence dominated management thinking

Until the 1980s most organization theorists thought that creating an organization where functional excellence was dominant would result in an excellent organization. Certainly higher education, consulting organizations, and management gurus fostered such a belief. Certification also created the illusion that such specialists must have specific knowledge not necessarily shared by general management. This last point is important since it led to the perception that executives responsible for overall performance were not equipped to make decisions in some specialized areas of the firm.

2. Lack of an integrating enabler

The latter part of the 20th century saw the emergence of information technology within the organization. The first application of such technology was the financial functions of the firm. In the 1960s and 1970s, these functions were computerized on large mainframe computers. In the 1970s the emergence of mini-computers allowed the creation of networks amongst the various functional areas. Therefore, it was not until the 1980s that the coordinating mechanisms to allow the effective and efficient transfer of information began to exist. With the continuing advances of information technology in the 1980s and 1990s, the integration of functions began to take place. Also, software suppliers began to realize that a market for such applications existed. The result was the emergence of specialized software for logistics applications.

3. Lack of competitive environment

After World War II, the United States was the predominant player on the world stage in terms of industrial might. Symbolic of this industrial might were the United States automotive and steel industries. The CEOs in these industries were unable to conceive of a future that would include strong global competitors. An interesting contrast of Ford and Nissan [4] and studies on certain industries [8] provide vivid insights into such management thinking.

4. Importance of functional financial accountability

The audit environment at this time emphasized that each functional activity must be prepared to defend its day-to-day decision-making. Therefore, the distribution function within an organization would have to show the same ordering process for receiving product from a manufacturing plant as an outside firm. Audit trails were more important than coordinated product flow.

The 1980s brought about a dramatic change to such thinking. First, Japanese firms emerged, especially in the automotive and steel industries, using lean manufacturing as their mantra. Inventory wasn’t something to be optimized; it was to be eliminated under the concepts of lean manufacturing. The Japanese did not have space for warehouses for raw materials and finished product. In-process inventory was considered an eyesore and eliminated.

It became clear to U.S. automotive companies that they would have to change or face decreased market share or extinction in the marketplace. Lean approaches to manufacturing became commonplace within U.S. automakers. Their financial performance improved. Today, General Motors, Ford, and Chrysler are still facing significant challenges in the marketplace, but this is mainly due to long-term arrangements with retirees and workers resulting in very expensive pension and health care commitments.

Toyota [10] and an influential study on lean production [16] became models for American firms. Interestingly, the approaches the Japanese used, such as Kanban, did not easily transfer to
It is felt that many of the more sophisticated companies have yet to advance past stage two, link excellence. While many firms have embarked on the journey, they do not fully comprehend what is necessary to achieve supply chain excellence. To move beyond stage two, the firm must take deliberate steps organizationally to make supply chain excellence a dominant strategy of the firm.

While there has been progress concerning information technology to support supply chain excellence, much of the software that has been installed supports link excellence. Interfaces have been developed that have aided visibility, but the entire view of the supply chain does not exist in most organizations. As a result, link excellence is still the major objective. Firms like Dell have made supply chain visibility a priority.
successful security management depends on the involvement of users and stakeholders, whose knowledge of information systems security issues may be lacking, resulting in reduced participation. Organizations seek to retain knowledge in their operations, but the extent to which knowledge is captured on security-related topics is not necessarily handled with the same consistency and rigor. Carnegie Mellon’s Computer Emergency Response Team (CERT) [2] has collected statistics showing that six security incidents were reported in 1988 compared to 137,529 in 2003. Furthermore, CERT [2] reported that 171 vulnerabilities were reported in 1995 in comparison to 7,236 in 2007. In addition, the Federal Bureau of Investigation (FBI) conducted a survey in which 40% of organizations claimed that system penetrations from outside their organization had increased from the prior year by 25% [6].

Password Authentication

Research on passwords is necessitated in spite of a movement towards alternative security techniques, such as bioidentifiers, individual certificates, tokens and smart cards. Smart cards communicate directly with the target system and run the authentication procedure themselves. A survey of 4,254 companies in 29 countries was conducted by Dinnie [5] to identify a global perspective on information security. The survey indicated that password authentication in the USA is the preferred security method, utilized 62% of the time, as opposed to smart card authentication being used only 8% of the time and certificates 9% of the time. In Australia, password authentication is used 67% of the time, as opposed to smart card authentication being used only 9% of the time and certificates 5% of the time. The remaining countries surveyed showed password authentication at 58% with smart card authentication at 4%. A common challenge faced by individuals today is the need to simultaneously maintain passwords for many different systems in their work, school and personal lives. Research conducted by Wiedenbeck, Waters, Birget, Brodskiy and Memon [17] suggests that stringent rules for passwords lead to poor password practices that compromise overall security. Human limitations can compromise password security because users are unable to remember passwords and therefore keep insecure records of their passwords, such as writing a password down on paper [19].

Miller’s [9] Chunking Theory and Cowan’s [4] research are useful to consider regarding human and social factors in organizational security, specifically when developing a model for password guidelines. This theory classifies data in terms of chunks and indicates that the capacity of working memory is 7±2 chunks of information. More recent research suggests that the mean memory capacity in adults is only three to five chunks, with a range of two to six chunks as the real capacity limit. A chunk of data is defined as being a letter, digit, word or different unit, such as a date [9]. Similar to Miller’s Chunking Theory, Newell, Shaw, and Simon [11] suggest that highly meaningful words are easier for a person to learn and remember than less meaningful words, with meaningfulness defined by the person’s number of associations with the word. Research suggests that passwords could be more memorable if users composed their passwords of familiar characters such as phone numbers [15]. Memorizing a string of words that represent complete concepts is easier than remembering an unrelated list of words, suggests Straub [13]. A study was performed to test passwords between five and eight characters in length [12]. The research suggests that increasing password length to a minimum of six to eight characters reduces crackability and therefore increases password strength in terms of security. Another study suggests that crack-resistant passwords were achieved through the use of a sentence-generation password method that required the user to embed a digit and special character into the password [16]. Carstens, Malone and McCauley-Bell [3] performed a study at a large federal agency to determine whether a difference exists in people’s ability to remember complex passwords with different difficulty levels. The results suggest that as long as a password is composed of meaningful information unique to an individual, human memory capabilities enable the individual to recall up to four chunks of data consisting of up to 22 characters.
participants responded that of the passwords forgotten, 54.2% created the forgotten password themselves, 37.5% were assigned the password, and 8.3% were both created by themselves and assigned.

Personal Passwords

The survey participants were asked to answer questions regarding their personal passwords. Collectively, the survey respondents indicated having between 1 and 10 personal passwords each was responsible for remembering. The average number of passwords per participant was 4.2. The difficulty level of the passwords that survey participants had to remember was distributed as follows: 23% of passwords comprised only words, 19.9% comprised two or more words, 5.7% comprised unfamiliar (not meaningful) numbers, 13.5% comprised familiar numbers (street address, SSN, birth date, etc.), and 39.9% comprised a combination of numbers, symbols and letters. The data collected indicated that 0.44% of respondents had passwords 1-3 characters in length, 42.58% had passwords 4-6 characters in length, 44.25% had passwords 7-9 characters in length, and 12.83% had passwords of 10 or more characters.

Participants were asked in the survey whether their passwords are saved on their computer so that they do not have to retype their passwords when logging in to websites. The participants responded that 51.1% save their password on their computer and 44.9% do not. When asked whether their passwords are written down on paper for easy recall, 18.4% of participants answered that they do write their passwords down on paper for recall purposes and 81.6% do not. When responding to the question regarding how often their passwords are changed, survey respondents answered the same as reported for the work and school passwords: 6.3% answered monthly, 22.9% answered annually, 62.5% answered never, 8.3% answered "other", and 2% did not respond. Participants were asked whether their password has ever been changed to a password used in the past. There was no response from 2% of the survey respondents. However, those participants that did respond answered that 51% of the time when their password was changed, they changed it to a password used in the past. When participants were asked how many times in the past month a password had been forgotten, the range was between one and four times a month. Furthermore, survey participants responded that of the passwords forgotten, 60% created the forgotten password themselves, 33.3% were assigned the password, and 6.67% were both created by themselves and assigned.

5. Conclusion

The world has been revolutionized by the amount of information that makes its way into the daily lives of individuals and ultimately organizations. It is therefore necessary that research continues to identify specific ways to assist individuals in handling the abundance of information. Although there are many alternatives to password authentication, individuals have multiple passwords that are used daily, as password authentication is still considered the most common form of authentication. Therefore, an issue for information security personnel is to reduce the information load faced by individuals trying to maintain numerous passwords for recall. Future research in password authentication practices should continue to explore the human and social factors in organizational security. Organizational and individual password usage as new technology emerges will likely increase the memory demands placed on individuals daily. Research will need to be conducted continuously to keep a pulse on the password demands placed on individuals and to identify techniques that assist humans with the management of their passwords. Determining the links between password issues and other newly identified issues in human memory limitations, and strategies to reduce the potential vulnerabilities produced, will be crucial for organizational success. Until passwords are no longer in use, it is important that practitioners
and researchers continue their efforts as a building block for future research that focuses on the human side of information security and specifically the human and social aspects of password authentication.

6. References

[1] Belsis, P., Kokolakis, S., & Kiountouzis, E. (2005). "Information systems security from a knowledge management perspective", Information Management & Computer Security, 13(3), 189-202.
[2] Carnegie Mellon Computer Emergency Response Team (CERT). (2007). Computer emergency response team statistics. Retrieved March 25, 2008, from http://www.cert.org/stats/cert_stats.html#incidents
[3] Carstens, D. S., Malone, L., and Bell, P. (2006). "Applying Chunking Theory in Organizational Human Factors Password Guidelines", Journal of Information, Information Technology, and Organizations, 1, 97-113.
[4] Cowan, N. (2001). "The magical number 4 in short-term memory: A reconsideration of mental storage capacity", Behavioral and Brain Sciences, 24(1), 87-185.
[5] Dinnie, G. (1999). "The second annual global information security survey", Information Management & Computer Security, 7(3), 112-120.
[6] Ives, B., Walsh, K., & Schneider, H. (2004). "The domino effect of password reuse", Communications of the ACM, 47(4), 75-78.
[7] Lewis, J. (2003). "Cyber terror: Missing in action", Knowledge, Technology & Policy, 16(2), 34-41.
[8] McCauley-Bell, P.R., & Crumpton, L.L. (1998). "The human factors issues in information security: What are they and do they matter?", Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, USA (pp. 439-442).
[9] Miller, G.A. (1956). "The magical number seven plus or minus two: Some limits on our capacity for processing information", Psychological Review, 63, 81-97.
[10] National Institute of Standards and Technology (NIST). (1992). Computer System Security and Privacy Advisory Board (Annual Report), 18.
[11] Newell, A., Shaw, J. C., & Simon, H. (1961). Information Processing Language V Manual. Englewood Cliffs, NJ: Prentice-Hall.
[12] Proctor, R.W., Lien, M.C., Vu, K.P.L., Schultz, E.E., & Salvendy, G. (2002). "Improving computer security for authentication of users: Influence of proactive password restrictions", Behavior Research Methods, Instruments, & Computers, 34, 163-169.
[13] Straub, K. (2004). "Cracking password usability: exploiting human memory to create secure and memorable passwords", UI Design Newsletter. Retrieved March 2, 2008, from http://www.humanfactors.com/downloads/jun04.asp
[14] U.S. Department of Homeland Security. (2002). Federal information security management act. Retrieved March 2, 2008, from http://www.fedcirc.gov/library/legislation/FISMA.html
[15] Vu, K.P.L., Bhargav, A., & Proctor, R.W. (2003). "Imposing password restrictions for multiple accounts: Impact on generation and recall of passwords", Proceedings of the 47th Annual Meeting of the Human Factors and Ergonomics Society, USA (pp. 1331-1335).
[16] Vu, K.P.L., Tai, B.L., Bhargav, A., Schultz, E.E., & Proctor, R.W. (2004). "Promoting memorability and security of passwords through sentence generation", Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting, USA (pp. 1478-1482).
[17] Wiedenbeck, S., Waters, J., Birget, J.C., Brodskiy, A., & Memon, N. (2005). "PassPoints: Design and longitudinal evaluation of a graphical password system", International Journal of Human-Computer Studies, 63, 102-127.
[18] Wood, C.W., & Banks, W.W. (1993). "Human error: an overlooked but significant information security problem", Computers & Security, 12, 51-60.
[19] Yan, J., Blackwell, A., Anderson, R., & Grant, A. (2004). "Password memorability and security: Empirical results", IEEE Security and Privacy, 2(5), 25-31.
The New Approach Based on RSM and Ants Colony System for
Multiobjective Optimization Problem: A Case Study
Abstract

1. Problems and questions addressed

Today's industrial environment is very competitive and requires shorter manufacturing lead times, lower costs, and higher quality products. These requirements produce complex problems characterized by multiple objectives as well as complex design-objective and constraint relations. With a high number of variables, multiobjective optimization (MOO) becomes complex and very expensive to solve, particularly when computational time is taken into account.

Generally, MOO problems are difficult to solve. Numerical optimization is one of the search methods often used to determine the best solution of MOO problems. Several algorithms have been developed for single-objective applications. However, given the complexity and the multiple objectives of the MOO problems currently found in the manufacturing field, researchers now focus on optimization algorithms for multiple objectives and multiple constraints, that is, optimization problems with a high degree of complexity.

2. Work performed

The response surface methodology (RSM) has been used in the past in the optimization process to reduce computation time. However, RSM needs many independent variables to construct a proper response surface. The Ant Colony System (ACS), a metaheuristic inspired by studies of ant behaviour, is considered a multi-agent approach for solving combinatorial optimization problems. The ACS has also been used in the past for optimization problems and has proved to be a powerful optimization technique.

In this paper, a hybrid approach combining RSM and an extension of the ACS to continuous domains is proposed for solving the continuous MOO problem. RSM modelling is used to determine the objective functions, and the ACS is then applied to search for the optimum solution. Applications of the proposed approach show that the hybrid approach requires less time to obtain the optimum solution and that the quality of the solutions is better compared with other techniques developed in this domain. The proposed hybrid algorithm consists of seven steps and is shown in Figure 1.

3. Results and conclusions reached

To demonstrate the high potential of the proposed approach, a case of a multistage flash (MSF) desalination process has been investigated. This is an optimization problem with two performance objectives: maximization of water production (distillate produced: DF) and minimization of blow down flow (BDF) rates. MSF is an industrial process that reheats ocean water and evaporates and condenses part of it to obtain fresh water (no salt). The case study investigated in the paper is a three-stage MSF operation process. There are five (5) factors at the input of the process that determine and principally influence the values of the two performance objectives at the output. The process was modeled with RSM to establish the mathematical model, and the ACS was then applied (to the mathematical model) to optimise and obtain the best solution. The problem had
been simulated in Matlab, and the program execution shows that after 260 iterations (Figure 2) the best solution is obtained. The results show that the proposed hybrid approach is really powerful and better than the other techniques in terms of quality and robustness of solution.
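The hybrid idea described above can be sketched in a few lines. This is a minimal illustration, assuming a simple one-dimensional stand-in for the RSM-fitted response surface and a basic continuous ant-colony-style sampler; the authors' seven-step algorithm and their fitted MSF model are not reproduced in the text, so the function names and parameters below are assumptions.

```python
import random

# Stand-in response surface for illustration only; the paper's fitted RSM
# model of the MSF process is not given in the text.
def response(x):
    return -(x - 3.0) ** 2 + 9.0   # single maximum at x = 3

def acs_continuous(f, lo, hi, ants=20, iters=200, shrink=0.9, seed=0):
    """Minimal continuous ant-colony-style search (a sketch, not the
    authors' seven-step algorithm). Each iteration, 'ants' sample around
    the best-known solution; the sampling radius shrinks over time,
    mimicking pheromone concentration near good solutions."""
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)
    best_y = f(best_x)
    radius = (hi - lo) / 2.0
    for _ in range(iters):
        for _ in range(ants):
            x = min(hi, max(lo, rng.gauss(best_x, radius)))
            y = f(x)
            if y > best_y:            # maximizing, as for distillate flow DF
                best_x, best_y = x, y
        radius *= shrink              # concentrate search near the best ant
    return best_x, best_y

x_opt, y_opt = acs_continuous(response, 0.0, 10.0)
```

In the paper's setting, `response` would be replaced by the RSM model fitted to the MSF process data, and the search would trade off the two objectives (DF and BDF) rather than a single fitness value.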
Figure 1. Flowchart of the proposed hybrid algorithm (RSM simulation loop; the search stops when the iteration number equals the given number of iterations)

Figure 2. Fitness function evaluation versus iteration index (approximately 0.80 to 0.98 over 300 iterations)
patients from the ED due to extended wait times. Eloping occurs at different stages of ED admittance. Commonly, the four stages include: 1. leave pre-triage, 2. leave post triage, 3. leave post RN - pre MD exam, and 4. against medical advice (AMA) - leave post MD exam. Both of the above described factors (bed holding and eloping) can result in lower hospital income and the loss of an intangible asset, the hospital's reputation. This study utilizes overall ED data collected at Moses Cone Hospital in North Carolina from 2006 to 2007. Moses Cone Hospital, which has 535 beds, offers comprehensive services in a wide range of medical and surgical specialties.

2. Moses Cone Health System

Moses Cone Health System is a multi-hospital system with a range of outpatient services designed to provide care for Piedmont residents throughout all stages of their lives. Residents of its primary service area include four neighboring counties and the city of Kernersville. More than 7,400 employees work throughout the Moses Cone Health System and the outpatient services. As a teaching facility for internal medicine, family practice, obstetrics, gynecology and pediatrics, Moses Cone Health System gives patients access to the latest developments in medical care from their first moments of life through later years.

The Moses H. Cone Memorial Hospital, the flagship of Moses Cone Health System, was established in 1953 to serve the community by delivering high-quality healthcare. Located on a 63-acre campus, this 535-bed hospital is the largest medical center in its four-county region. To ensure that the services are convenient for patients, the hospital continues to invest in updating and expanding its facilities. While continuing to grow, modernize and plan for the future, Moses Cone Hospital has consistently kept its rates lower than those of other comparable providers. The Moses H. Cone Memorial Hospital Level-II Trauma Center serves the greater Greensboro community by providing superior care to people who have sustained severe injuries. The trauma center began in 2000 and was recently recertified by the State of North Carolina until 2009. Trauma surgeons are specially trained general surgeons. The trauma team coordinates different medical specialties, including, but not limited to, neurosurgery, orthopedic surgery, plastic surgery, oral and maxillofacial surgery, physical therapy, occupational therapy, speech therapy and rehabilitation services to optimize patient care.

3. Data and Methods

Emergency department patient data collected at Moses Cone Hospital over two years was used for this study. For the year 2006 all twelve months were analyzed, while for the year 2007 data was used for the first 10 months. On average, the total number of patients attending the emergency department per year was around 65,000. Data was further filtered to identify "eloping" patients who left the ED without being seen and treated at the hospital. In order to improve the efficiency of the ED and reduce costs, we identified and evaluated non-value-added activities within different stages of elopement. The elopement data was analyzed on an hourly basis. Further, elopement data was analyzed during peak periods by different stages of elopement.

4. Analysis of Elopement

In the year 2006 a total of 5,974 patients eloped from the ED, while in 2007 a total of 6,070 patients eloped from the ED. Depending on the level of service administered for patients after their arrival to the ED, eloping commonly has four categories. Patients leaving the ED without being directed to triage belong to the first category (1. leave pre-triage). Patients leaving the ED after being directed to triage to determine the level of patient care needed belong to the second category (2. leave post triage). Patients leaving the ED after meeting with the nurse for an initial check-up and before seeing the MD are categorized as (3. leave post RN - pre MD exam). Finally, patients leaving the ED after diagnosis by the MD, against medical advice, are categorized as (4. against medical advice (AMA) - leave post MD exam). Eloping data was analyzed to observe seasonal
trend patterns over the twelve months. We also analyzed data based on age groups and the four eloping categories. Data was analyzed and plotted for both the years 2006 and 2007. Figure 1 shows the monthly data for the total number of patients.

Figure 1. Monthly data for total number of patients (Monthly Total Patient Data, Jan-Dec, years 2006 and 2007)

Figure 2 shows the seasonal trend patterns for eloping patients. It can be seen that patients eloping from the ED increase in the spring months.

Figure 2. Seasonal trend for eloping patients at the ED (Eloping Patients Monthly Data, Jan-Dec, years 2006 and 2007)

Figure 3 shows elopement based on different categories of elopement. Out of the total eloping patients, on average 22% leave at the pre-triage stage, 34% leave at the post-triage stage, 23% leave at the post RN and pre MD stage, and 21% leave post MD against medical advice (AMA). The length of waiting time in the ED has an influence on the loss of patients at different time periods and categories from the ED.

Figure 3. Elopement based on categories (Patients Based on Type of Elopement: AMA - leave post MD exam; leave post RN - pre MD exam; leave post-triage; leave pre-triage; years 2006 and 2007)

5. Analysis for "Hourly Elopement Data"

Figure 4 shows hourly elopement data (two-hour time periods) for patients for the months January through May of 2007. These months were picked as they had a higher number of eloping patients.

Figure 4. Hourly Elopement Data for 2007 (two-hour increments): total number of patients eloping on an hourly basis, in two-hour periods from 0 to 24, Jan-May

Figure 5. Eloping patient data for each stage during peak period (10am - 24pm): eloping patients between 10am and 24pm for Jan-May 2007, by elopement type (Pre Triage, Post Triage, Post RN, AMA)
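The two-hour binning behind the hourly analysis can be sketched as follows, using made-up elopement hours rather than the hospital's actual records.

```python
from collections import Counter

# Hypothetical elopement event hours (0-23); the real study used ED
# timestamps from Moses Cone Hospital, which are not reproduced here.
elopement_hours = [1, 3, 11, 11, 13, 14, 18, 20, 22, 23]

def two_hour_bins(hours):
    """Count events in two-hour periods ("0 to 2", "2 to 4", ..., "22 to 24"),
    matching the labels used in the hourly elopement charts."""
    bins = Counter((h // 2) * 2 for h in hours)
    return {f"{b} to {b + 2}": bins.get(b, 0) for b in range(0, 24, 2)}

counts = two_hour_bins(elopement_hours)
```

Peak-period analysis (10am to midnight) then reduces to summing the bins from "10 to 12" through "22 to 24".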
Figure 5 shows the number of eloping patients for each eloping category. As seen from the graph, the highest elopement occurs during the post-triage stage. For each type of elopement we identified possible reasons in consultation with ED staff and hospital administration.

6. Recommendations

The loss of eloping patients directly impacts the revenue stream and reputation of the hospital. In order to address these issues, possible suggestions were recommended to the hospital. These include:
1. Hire additional PAs and RNs in order to reduce the loss of patients in the earlier stages of the ED cycle during the peak period (10 AM - midnight)
2. Increase the number of beds available for the ED
3. Create better procedures for the non-insured
4. Use more experienced RNs for triage
5. Improve the training of receptionists

7. Conclusion

Emergency department revenues are hampered by the loss of patients at different stages of admittance, which can be generically termed "eloping." A primary reason for eloping is the long waiting time in the ED. Based on the different eloping categories, the ED will lose different segments of revenue streams. Some factors influencing long wait times include bed holding and MD and staff throughput efficiency per patient. Eloping has negative implications on ED revenue and causes a subsequent loss of goodwill for the hospital. This paper analyzes the loss of patients (elopement) to identify trends and relevant explanations for them.

8. References

[1] Hospital Statistics (1903-1994 and Previous Years). Chicago, Ill.: American Hospital Association; 1993.
[2] Grumbach K, Keane D, Bindman A. Primary care and public emergency department overcrowding. Am J Public Health. 1993;83:372-377.
[3] McCaig LF. National Hospital Ambulatory Medical Care Survey: 1992 Emergency Department Summary. Adv Data Vital Health Stat. 1995; no. 245. DHHS publication PHS 94-1250.
[4] Canadian Institute for Health Information: National Health Expenditure Trends, 1975-2004, Ottawa 2004.
[5] Health Canada: National Health Expenditures in Canada, Ottawa 1997.
[6] McCaig L, Burt C: National Hospital Ambulatory Medical Care Survey: 2001 Emergency Department Summary. Advance Data from Vital and Health Statistics, Centre for Disease Control and Prevention, National Centre for Health Statistics, 2003, Report No. 335.
[7] Huang DT: Clinical review: Impact of emergency department care on intensive care unit costs. Critical Care 2004, 8:498-502.
[8] Fromm RE Jr, Gibbs LR, McCallum WG, Niziol C, Babcock JC, Gueler AC, Levine RL: Critical care in the emergency department: a time-based study. Crit Care Med 1993, 21:970-976.
[9] Nguyen HB, Rivers EP, Havstad S, Knoblich B, Ressler JA, Muzzin AM, Tomlanovich MC: Critical care in the emergency department: A physiologic assessment and outcome evaluation. Acad Emerg Med 2000, 7:1354-1361.
[10] Nelson M, Waldrop RD, Jones J, Randall Z: Critical care provided in an urban emergency department. Am J Emerg Med 1998, 16:56-59.
[11] Varon J, Fromm RE, Levine RL: Emergency department procedures and length of stay for critically ill medical patients. Ann Emerg Med 1994, 23:546-549.
[12] Svenson J, Besinger B, Stapczynski JS: Critical care of medical and surgical patients in the ED: Length of stay and initiation of intensive care procedures. Am J Emerg Med 1997, 15:654-657.
Matthew E. Elam
Texas A&M University-Commerce
matthew_elam@tamu-commerce.edu
the U.S. must cultivate the STEM talents of all of its citizens, not just those from groups that have traditionally worked in STEM fields.

Many efforts have been put forth to increase the percentage of underrepresented populations in the STEM workforce. These include summer and after-school programs for high school students and the integration of STEM applications into high school curricula. The unfortunate outcome of many of these efforts is that they are temporary and disappear once they no longer have funding. Their positive effects then dissipate over time, and the same problems with lack of diversity and shortage of human resources in STEM disciplines resurface.

The fact that women and minorities are underrepresented in the engineering labor force has led to the development of the following three questions:

• What are the psychological mediators of both gender and ethnic group differences in entry into and persistence in the engineering labor force?
• What are the family and school forces that underlie the gender and ethnic group differences in these psychological mediators?
• How do experiences in tertiary educational settings and in manufacturing work settings influence gender and ethnic group differences in entry into and persistence in the engineering labor force? (Eccles and Davis-Kean, 2003)

To ensure that the manufacturing sector continues to be a driving force for economic development, Alabama must have an adequate workforce that is not only knowledgeable in manufacturing principles but also technologically educated in advanced technological platforms. This is the main motivator for the development of a web-based tool that assists middle-school and high-school students in learning fundamental concepts in advanced manufacturing technologies.

2. The Tool

This section describes the software developed to help middle and high school students assimilate basic manufacturing engineering concepts. Images of the software screens are presented as figures.

2.1 Welcome Page

Figure 1 depicts the "Welcome" page of the software. This page briefly describes
3. Conclusions
4. Acknowledgements
The authors want to thank the Alabama
Partnership for Community Partnership for
funding the development of this tool.
5. References
table gives an overview of the number of on-duty deaths and injuries that occurred from 1997 to 2006.

Table 1: On-Duty Deaths and Injuries that Occurred from 1997 to 2006 [2](5)

Year   Deaths(1)   Fireground Injuries(2)   Total Injuries(2)
1997   100         40,920                   85,400
1998   93          43,080                   87,500
1999   113         45,550                   88,500
2000   105         43,065                   84,550
2001   105(3)      41,395                   82,250
2002   100         37,860                   80,800
2003   113         38,045                   78,750
2004   119(4)      36,880                   75,840
2005   115         41,950                   80,100
2006   106         N/A yet                  N/A yet

(1) This figure reflects the number of deaths as published in USFA's annual report on firefighter fatalities. All totals are provisional and subject to change as further information about individual fatality incidents is presented to USFA.
(2) This figure reflects the number of injuries as published in NFPA's annual report on firefighter injuries.
(3) In 2001, an additional 341 FDNY firefighters, three fire safety directors, and two FDNY paramedics died in the line of duty at the World Trade Center on September 11.
(4) The Hometown Heroes Survivors Benefit Act of 2003 has resulted in an approximate 10% increase to the total number of firefighter fatalities counted for the annual USFA report on Firefighter Fatalities in the United States beginning with CY2004.
(5) Published by the U.S. Fire Administration, 2006.

Harvard researcher Stefanos N. Kales and colleagues analyzed data on all firefighter deaths between 1994 and 2004, except those linked to 9/11. The research found that firefighters are much more likely to die of heart disease. The firefighting job exposes them to extreme exertion in toxic, high-risk environments. The heavy gear needed for this type of job requires firefighters to be fit enough to carry such equipment. The environment and the heavy gear make the job more deadly for firefighters. Understanding the risks around firefighting, and the reasons behind them, is vital.

The research interest is to minimize line-of-duty deaths, whether due to heart attacks or other incidents at the scene. Some research has been done to maximize safety and minimize deaths by tracking firefighters through the incident scene and monitoring their status. A company in Florida announced that it will introduce a firefighter tracking system called the Pak-Tracker Firefighter Locator, which integrates a motion sensor device with a transceiver to send and receive critical information between individual firefighters and a command station. The product was supposed to be introduced in April of this year at the Fire Department Instructor's Conference in Indianapolis. The prototype for the product has a large size that can add more weight to the firefighter's gear. Also, no price prediction was released with this prototype.

Researchers at Worcester Polytechnic Institute announced in August of 2007 that they have been able to reduce the size of the locator transmitter so that every firefighter can wear one. The researchers expect that the system will be available in two years. The technology used in the system was not released or published.

This article will look at the design criteria based on the previous research and products and then present the conceptual design of a prototype
system for tracking firefighters at fire scenes based on a Geographic Information System (GIS). The system consists of a base unit and mobile units. The base unit consists of two reader antennas, fire-scene geo-referencing and management (GeoMan) software, and a laptop computer. The mobile unit consists of an active radio frequency identification (RFID) tag, an electronic compass, and a Bluetooth transmitter. The GeoMan software in the base unit creates a local geo-referencing system at the fire scene, processes the received radio signals to infer the instantaneous locations of the mobile units (firefighters), and displays the real-time locations of these units in the locally defined reference system. The mobile units continuously broadcast signals, which are received by the reader antennas and processed by the GeoMan software to track firefighters at the fire scene. The electronic compass provides basic local orientation information for firefighters, which is transmitted to the base unit via the Bluetooth transmitter installed in the mobile unit. Local orientation information is also valuable for inferring a firefighter's track in the event that radio signals are lost.

4. Conceptual Model

In the proposed model we considered the following characteristics: light weight, small size, power efficiency, real-time tracking, and accurate positioning. The first three characteristics are essential from a firefighter perspective. The other characteristics are crucial to the success of any tracking system, since pinpointing firefighters' locations in real time at fire scenes is critically needed. This is especially important in situations when a firefighter is not responding or is instantaneously unaccounted for. Real-time tracking enables the fire scene command center not only to account for firefighters, but also to use their relative positions for making life-saving decisions. Accurate positioning is an important characteristic given the size of fire scenes and the large-scale map needed to show the locations. To be practical and usable, a tracking system needs to be power efficient as well.

In order to better understand the process of firefighting, we attended a "training burn" by the Orange County Fire Rescue Department (OCFRD) in mid 2007. We have also met and interviewed a few firefighters with OCFRD. We believe that such a needs assessment study is fundamental not only from a customer needs perspective, but also from a conceptual design standpoint. From the training burn and the interviews we learned that weight, size, and power efficiency are really important for many reasons. First, firefighters are already loaded with devices of such weight and size that adding another device is a real burden. Figure 1 shows the QFD that explains the customer needs and their relation to the product characteristics.

Figure 1: QFD for Tracking System

4.1 System Description

The proposed system consists of (a) a lightweight, small mobile unit that has a Radio Frequency ID (RFID) tag and a digital compass, and (b) a base unit. The RFID tag will transmit firefighter information (a unique identifier for the firefighter) to the base unit. The direction and orientation information of the firefighter will be broadcast to the base unit by the electronic compass via Bluetooth. The base unit is connected to two RF reader antennas, which will be used for inferring firefighters' positions by triangulation. The positions and direction/orientation information will be processed by the
proposed GeoMan software, and a real-time map will be produced and displayed on the laptop computer of the fire scene command center. Figure 2 shows the architecture of the proposed system.

The main goal of designing the proposed prototype system is to provide an efficient real-time tracking device that satisfies firefighters' needs. Such a system will help save the lives of those great people by accurately tracking their position at a fire scene. Based on the needs assessment study, the characteristics of light weight, small size, power efficiency, and reasonable cost have been identified as essential for designing a firefighter tracking system that satisfies customer needs and operates efficiently.

[...] we venerate them and although we care deeply, we are still resilient and never take on the full effect of their tragic demise. We simply forget about the families of these astounding men and women who are left with nothing but despondency.
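The position-inference step described above can be sketched as a two-circle intersection: given the two reader antennas and a range estimate to each, the mobile unit must lie where the two range circles cross. This is a hypothetical minimal version, since the actual GeoMan processing is not published; antenna positions and range estimates are assumed inputs.

```python
import math

def infer_position(a1, a2, d1, d2):
    """Given two reader-antenna positions a1, a2 and range estimates d1, d2
    (e.g. derived from RFID signal strength), return the two candidate 2-D
    positions where the range circles intersect. With only two antennas the
    fix is ambiguous up to a reflection across the antenna baseline; compass
    heading data could break the tie."""
    (x1, y1), (x2, y2) = a1, a2
    dx, dy = x2 - x1, y2 - y1
    d = math.hypot(dx, dy)                  # distance between antennas
    if d == 0 or d > d1 + d2 or d < abs(d1 - d2):
        raise ValueError("range circles do not intersect")
    a = (d1 ** 2 - d2 ** 2 + d ** 2) / (2 * d)   # a1-to-chord distance
    h = math.sqrt(max(0.0, d1 ** 2 - a ** 2))    # half-length of the chord
    mx, my = x1 + a * dx / d, y1 + a * dy / d    # chord midpoint
    p1 = (mx + h * dy / d, my - h * dx / d)
    p2 = (mx - h * dy / d, my + h * dx / d)
    return p1, p2

# Example: antennas at (0,0) and (10,0); one candidate is approximately (3, 4).
p1, p2 = infer_position((0.0, 0.0), (10.0, 0.0), 5.0, 65 ** 0.5)
```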
References
Abstract

The demand for electricity in the United States is increasing. Nuclear power plants have better capacity, price, and environmental impact than other types of power plants. In the last few years, the nuclear power industry has been revitalized. There is an urgent need to find new engineers to support the needs of the nuclear power industry. This paper discusses this issue and proposes some actions that can be taken quickly by educational institutes and industry to effectively address the engineering needs of the nuclear power industry.

1. A looming nuclear power renaissance

The demand for electricity in the United States is increasing. According to the Department of Energy, the demand for electricity in the United States will increase by 45% by 2030 [1].

Figure 2. U.S. Electricity Production Costs (cents per kilowatt-hour, 1995-2005)

The coal power plants face a major challenge of reducing carbon dioxide emissions. Meeting these requirements will raise the price of energy generated by coal power plants.

Currently, the amount of green energy generated from wind and solar power accounts for less than 5% of total energy consumption.
increased about 500 percent since 2002, including a doubling in price last year, the operating costs of nuclear power plants have changed very little. Moreover, the rise in price has prompted an exploration boom that will ultimately lead to more mines and a greater supply. Construction accounts for as much as ¾ of the cost of nuclear generation. The construction of nuclear power plants has transitioned to a manufacturing process. Large modules are shop-fabricated and then assembled on site. This practice significantly reduces the time and cost to build nuclear power plants (see [2]).

Nuclear reactors emit almost none of the greenhouse gases responsible for global warming. Without nuclear power, the U.S. electric sector's carbon dioxide emissions would be 27% larger.

In summary, nuclear power offers the possibility of large quantities of base-load electricity that is cleaner than coal, cheaper and more secure than gas, and more reliable than wind.

In addition to the cost and environmental concerns, the United States would like to be less dependent on foreign oil.
decline. The public got scared. The regulatory environment tightened, and as a result cost went up. Billions were spent bailing out companies losing money on nuclear power plants. For two decades, neither the government nor bankers wanted to touch nuclear power plants. The nuclear power industry in the US experienced almost no development during that period.

As a result, many of the existing nuclear power plants are using outdated technologies. This is partly the reason that students are not interested in a job in the nuclear power industry. Therefore, there are very few young engineers in the nuclear power industry; very few even thought about entering it.

On the other hand, over 25 new nuclear power plants have been announced recently [7] and efforts are in full swing to revitalize the industry. This new boom requires thousands of engineers to take on the expanding engineering tasks related to the design, operation, and maintenance of the plants. To make matters worse, many current engineers in the nuclear industry are at or near retirement age. Due to the nature of the nuclear power industry, it takes a few years for a new engineer to go through the rigorous training program in the nuclear industry. As a result, there is an urgent need to find new engineers to join the nuclear power industry workforce to support the existing and new nuclear power needs. Currently, the US education system lacks the capability to provide the nuclear industry with the volume of qualified engineers it needs.

Many nuclear engineering programs prepare their students for a career in academia and national research labs rather than the nuclear power industry, since there was almost no need for new engineers from the industry in the last two decades. The Navy has been one of the major sources of nuclear power engineers in recent history. As a result, students in the nuclear power engineering programs in many universities receive mostly theoretical training and are encouraged to do fundamental research rather than gain practical, hands-on experience. They are not prepared for industrial jobs.

Other countries such as France, Japan, and Korea are moving ahead in terms of nuclear power plant building and educational programs in the field. The outsourcing of nuclear power engineering jobs to other countries is not desirable for the US. If we do not take action immediately, what happened to the manufacturing industry may happen to the nuclear industry.

3. How to address the needs of the nuclear power industry?

There are needs for two types of engineering resources in the nuclear power industry: operators/maintenance workers and engineers. The need for operators and maintenance workers can be addressed by the community colleges with a two-year degree relatively quickly. The community colleges are usually more flexible in changing their programs and curricula. They are also more responsive to the needs of industry. The need for engineers is more difficult to meet. There are very few four-year nuclear power engineering technology programs in the US; the University of North Texas is one of them [3]. The number of graduates from these programs falls far short of the needs of the nuclear power industry due to the latest boom in the industry. People in both industry and academia are realizing the seriousness of the problem faced by the nuclear power industry.

Faced with the challenge of meeting the increasing need for engineers in the nuclear power industry, universities and the nuclear power industry are making efforts to address this issue. For instance, at Texas A&M, a nuclear engineering technology program is being created within the Department of Engineering Technology and Industrial Distribution. However, major changes in the educational system face resistance in academia [4]. In addition, the effects of such changes will not be seen by the industry in the next 4 to 5 years, whereas the sudden increase in need from the nuclear power industry is immediate. A quicker and more efficient way to address the immediate needs of the nuclear power industry is needed.

We propose the following steps to partly address the immediate needs of the nuclear power industry:
5). Summer/Spring Student Internship

Students interested in the nuclear power industry will be encouraged to work as summer/spring interns for the nuclear power plant companies. The internship provides an opportunity for both the students and the company to get to know each other. During the internship, the students will learn the basics of the nuclear power plant, gain some practical engineering experience, and develop better ideas about what they should be focusing on when they come back to school. The fourth author, Jacob Schulz, worked at South Texas Project Nuclear Operating Company as a student summer intern. His experience as an intern resulted in a white paper on the recruiting and retaining of young engineers [5], and gave him a better idea of the coursework required to be successful in the industry.

6). Recruiting

Again, because of the downturn in the nuclear power industry in the last two decades, very few students know much about the industry. The nuclear power industry and universities need to work together to increase the awareness of the nuclear power industry among students. This can be achieved by teaching nuclear-related material in different courses, by the nuclear power industry participating in career fairs, and by collaboration on research projects between the nuclear power industry and universities. With a close tie established between faculty members and the nuclear power industry, the industry would have a much better chance of finding the students it needs. This can be done through internships, sponsored senior design projects, and funded research projects.

These intermediate steps should help address the urgent needs of the nuclear power industry as well as encourage students to enroll in the four-year nuclear power technology program. With the industry, universities, and community colleges working hand in hand, we are confident that the engineering needs of the nuclear power renaissance will be met.

having senior project design related to nuclear power plants, student summer interns, faculty summer fellowships, research projects supported by the nuclear industry, and a new nuclear power engineering technology program are just some of the steps that we can take to address this issue. One of the objectives of this paper is to increase the awareness of this issue in academia so that more universities will work on the solution to this problem.

5. References

[1] Lighting the future with clean nuclear energy, GP0066, Nuclear Energy Institute, Dec. 2006.

[2] T. Hurst, "Transfer ABWR construction techniques to U.S. shores", POWER, Vol. 151, No. 5, May 2007, pp. 43-46.

[3] M. Plummer, J. Davis, and C. Bittle, "Management changes as a threat to onsite delivery of nuclear engineering technology programs", 2007 ASEE Conference, AC 2007-3090.

[4] Report of the Nuclear Energy Research Advisory Committee Nuclear Power Engineering Curriculum Task Force, Department of Nuclear Engineering & Radiation Health Physics, Oregon State University, April 7, 2004.

[5] J. Schulz, "Ideas for recruiting/retaining young engineers", STP white paper, 2007.

[6] The changing climate for nuclear energy: Annual briefing for the financial community, Nuclear Energy Institute, Feb. 22, 2007.

[7] New Nuclear Plant Status, Nuclear Energy Institute, August 2007.

Wei Zhan is currently an assistant professor in the Department of Engineering Technology and Industrial Distribution, Texas A&M University, College Station, TX 77843-3367.

John Crenshaw is currently a vice president at South Texas Project Nuclear Operating Company in Bay City, TX 77414.

Tim Hurst is currently the principal I&C engineer at South Texas Project Nuclear Operating Company in Bay City, TX 77414. He is also the president of Hurst Technology in Angleton, TX 77515.
[Figure 2. Equivalent circuit of a dielectric containing a void, where: C1 – capacitance of the void; C2 – capacitance of the intact part of the dielectric in series with C1; C3 – capacitance of the rest of the intact dielectric surrounding the void.]
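The series-parallel structure in Figure 2 gives the terminal capacitance of the sample directly from circuit theory. The sketch below is a minimal illustration; the capacitance values (in pF) and the function name are arbitrary assumptions, not taken from the paper:

```python
def void_model_capacitance(c1, c2, c3):
    """Terminal capacitance of the three-capacitance void model:
    C1 (void) in series with C2 (intact dielectric above and below
    the void), with that combination in parallel with C3."""
    series = c1 * c2 / (c1 + c2)   # C1 in series with C2
    return c3 + series             # in parallel with C3

# Arbitrary example values in picofarads
print(void_model_capacitance(5.0, 1.0, 100.0))
```

Because C3 dominates in a sample with a small void, the terminal capacitance changes only slightly when the void discharges, which is why partial discharges are detected from the small apparent charge pulses rather than from a bulk capacitance change.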
Figure 3
Figure 4
Figure 5
Table 1. Statistical analysis of model results representing Thorndike's work:

Learning rate   Average number of   Variance   Standard      Coefficient of
value η         training cycles     σ²         deviation σ   variation ρ
0.1             100.7               93.7889    9.6845        0.0962
0.2             51.3                22.6778    4.7621        0.0928
0.3             34.8                10.1778    3.1903        0.0917
0.4             26.5                6.2778     2.5055        0.0945
0.5             21.5                2.9444     1.7159        0.0798
0.6             18.2                2.4        1.5492        0.0851
0.7             15.8                1.5111     1.2293        0.0778
0.8             14.2                1.2889     1.1353        0.08
0.9             12.6                1.1556     1.0750        0.0853

[Figure: Average number of training cycles versus learning rate value (0 to 1), decreasing from a virtual initial infinite error value toward the minimum error limit.]

description of original work performance results.

Additionally, it is interesting to observe that the suggested Thorndike model's performance appears analogous to the behavioral performance of another ACS optimization model. By applying the ant-density algorithm to solve some TSP instances, the results in [7] show that the efficiency per ant (required to reach the optimum) improves as the number of ants increases; see Figure 5. Consequently, it could be stated that the number of trials in the Thorndike model is analogous to the number of ants in an ACS adopting the ant-density algorithm for solving the TSP.

Finally, the presented model seems to take into account mixed learning paradigms constituted of reinforcement learning [8], error correction [9], and interaction with environmental conditions [10].
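The columns of Table 1 are internally consistent: the standard deviation is the square root of the variance, and the coefficient of variation ρ is σ divided by the mean. This can be checked in a few lines (row values copied from the table, three rows shown):

```python
import math

# Selected rows from Table 1:
# (learning rate, mean cycles, variance, std deviation, coeff. of variation)
rows = [
    (0.1, 100.7, 93.7889, 9.6845, 0.0962),
    (0.5, 21.5, 2.9444, 1.7159, 0.0798),
    (0.9, 12.6, 1.1556, 1.0750, 0.0853),
]

for eta, mean, var, sigma, rho in rows:
    assert abs(math.sqrt(var) - sigma) < 5e-4   # sigma = sqrt(variance)
    assert abs(sigma / mean - rho) < 5e-4       # rho = sigma / mean
```

The check passes within rounding tolerance for every row, confirming that the tabulated dispersion measures were all derived from the same training-cycle samples.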
Referring to [6], it is shown therein that the work of Thorndike and that of Pavlov support each other with respect to learning performance [3],[4]. It is thus apparent that both obey the generalized (LMS) algorithm for error minimization through learning convergence [4]. That algorithm also agrees with the behavior of the genetically modified "brainier" mouse, as shown in [13]. Additionally, the adaptation equations for all of the systems presented above run in agreement with each other's dynamic behavior. Moreover, the learning algorithms for the suggested systems are close to each other, with similar iterative steps (either explicit or implicit). Finally, it is worth noting that the rate of increase of the speed of response is analogous to the rate of reaching the optimum average speed in the ACS optimization process. Moreover, the increase in the number of artificial ants is analogous to the number of trials in Pavlov's and Thorndike's work, as shown in [1],[6],[14]. On the basis of the above learning algorithmic models, the issue of an optimal methodology for teaching how to read has recently been investigated [15].

References

[1] H. M. Hassan, 2004: "On evolutional study of comparative analogy between swarm smarts and neural network systems (part 1)", NICA 2004, Natural Inspired Computers and Applications, Hefei, China.

[2] S. Garnier, J. Gautrais, and G. Theraulaz, 2007: "The biological principles of swarm intelligence", Swarm Intelligence, Vol. 1, No. 1, June, pp. 3-31.

[3] E. L. Thorndike, 1911: Animal Intelligence, Darien, CT: Hafner.

[4] I. P. Pavlov, 1927: Conditioned Reflexes: An Investigation of the Physiological Activity of the Cerebral Cortex, New York: Oxford University Press.

[5] S. E. Hampson, 1990: Connectionistic Problem Solving: Computational Aspects of Biological Learning, Berlin: Birkhäuser.

[6] H. Hassan and M. Watany, 2003: "On comparative evaluation and analogy for Pavlovian and Thorndikian psycho-learning experimental processes using bioinformatics modeling", AUEJ, 6, 3, pp. 424-432, July.

[7] A. Colorni, et al., 1991: "Distributed optimization by ant colonies", Proceedings of ECAL91, Elsevier, pp. 134-142.

[8] D. E. Moriarty, et al., 1999: "Evolutionary algorithms for reinforcement learning", Journal of AI Research, 11, pp. 241-276.

[9] S. Haykin, 1999: Neural Networks, Englewood Cliffs, NJ: Prentice-Hall, pp. 50-60.

[10] M. Fukaya, et al., 1988: "Two Level Neural Networks: Learning by Interaction with Environment", 1st ICNN, San Diego.

[11] M. Dorigo, et al., 1997: "Ant colonies for the traveling salesman problem", www.iridia.ulb.ac.be/dorigo/aco/aco.html

[12] H. M. Hassan: "On principles of biological information processing concerned with learning convergence mechanism in neural and non-neural bio-systems", CIMCA'05, 28-30 Nov. 2005, Vienna, Austria.

[13] H. M. Hassan, 2005: "On quantitative mathematical evaluation of long term potentiation and depression phenomena, using neural network modeling", SimMod 2005, pp. 237-241.

[14] H. M. Hassan: "On Learning Performance Evaluation for Some Psycho-Learning Experimental Work versus an Optimal Swarm Intelligent System", IEEE ISSPIT 2005, 18-20 Dec. 2005, Athens, Greece.

[15] H. M. Hassan, et al.: "Towards Evaluation of Phonics Method for Teaching of Reading Using Artificial Neural Networks (A Cognitive Modeling Approach)", to be published at IEEE ISSPIT 2007, 15-18 Dec. 2007, Cairo, Egypt.
This research aims to study the relationships among some key variables that were identified as factors contributing to nursing errors, and to identify the key associations that need to be prioritized. Data from a local hospital were collected and analyzed in this study.

2. Methodology

2.1 Sampling and data collection

Nursing error data in this study were collected from a large hospital located in the southeast region. A voluntary reporting system for reporting medication errors was used in the hospital. The form used in this hospital is similar to the U.S. Pharmacopeia (USP) Medication Error Reporting Form, which is commonly used to report medication errors to the USP.

A random sample of 6000 medical variances (errors) that spanned a 4-year period was collected from the hospital. Based on the literature review and the availability of information, the following variables were identified and selected in the sample: (1) day of the week, (2) the appearance of the medication on the list of high-risk medications or confused names, (3) type of error, and (4) variance category. Initially, reports included the time at which the error was committed, but this information was thought to be incorrect and is no longer reported. It is assumed that the medication error occurred on the same day it was reported.

Day of the week provides information on what day of the week the nursing error occurred. Its values include Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, and Saturday. High-risk medications, as defined by the Institute for Safe Medication Practices (ISMP), are drugs that bear a heightened risk of causing significant patient harm when used in error. Examples on the list of high-risk medications include insulin and oxytocin, if given intravenously. Confused drug names are those that have look-alike and sound-alike name pairs; the list consists only of those name pairs that have been involved in medication errors published in the ISMP Medication Safety Alert. An example of a pair of drugs on this list is Cedex and Cidex.

The types of nursing error relate to the 5 R's of medication administration -- right patient, right dose, right medication, right route, and right time. The values that this variable can take include: medication incorrect, omitted medication, dose/rate incorrect, other, extra medication, medication given to wrong patient, time of medication administration incorrect, route incorrect, label incorrect, and order manipulation.

Variance categories of the error (i.e., dispensing variance, prescribing variance, and medication administration variance), the reasons and contributing factors for such a variance to occur (e.g., dosage or rate incorrect, wrong medication, extra medication, label incorrect, wrong patient, etc.), and the severity of the variance were recorded in the form by the nurses. To better understand the variances defined in the form, Table 1 provides a more detailed description of the categories of the variances. Variances are classified into categories from A to I. Categories A and B are used to identify process improvement issues and represent near misses. Categories E through I usually require the immediate notification of risk management.

2.2 Statistical Analysis

All the variables identified in this study were discrete in nature.

2.2.1 Descriptive statistics

Descriptive statistics for the five discrete variables can be seen in Tables 2 through 6. Results indicated that the occurrence of errors is not evenly distributed across the days of the week: Wednesdays have the most errors and Sundays the least, as shown in Table 2.
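The claim in 2.2.1 that errors are not evenly distributed across the days of the week is the kind of statement typically backed by a chi-square goodness-of-fit test against a uniform distribution. The sketch below uses invented daily counts, not the hospital's data:

```python
# Chi-square goodness-of-fit test for uniformity of errors across the
# seven days of the week (counts are hypothetical, for illustration only).
counts = {"Sun": 60, "Mon": 85, "Tue": 90, "Wed": 120, "Thu": 95, "Fri": 88, "Sat": 72}

total = sum(counts.values())
expected = total / len(counts)            # uniform expectation per day
chi2 = sum((obs - expected) ** 2 / expected for obs in counts.values())

# Critical value for 6 degrees of freedom at alpha = 0.05
CRITICAL_6_DF = 12.592
print(f"chi2 = {chi2:.2f}; reject uniformity: {chi2 > CRITICAL_6_DF}")
```

With these example counts the statistic comfortably exceeds the critical value, so uniformity would be rejected, mirroring the Wednesday-versus-Sunday pattern reported in Table 2.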
In addition, all the first-order effects were found to be significant.

5. References
quality values results in conflict, false expectations, and diminished capacity (Kouzes and Posner, 1993, p. 121). Kotter and Heskett (1992) argue that organizations with adaptive, performance-enhancing cultures outperformed non-adaptive, unhealthy ones precisely because of their emphasis on attending to their constituencies. These constituencies can be described as the organization's customers, stockholders, and employees. They found, by contrast, that in organizations with non-adaptive and unhealthy cultures, most managers are motivated by self-interest, and deception becomes a common organizational behavior.

In our studies on hypocrisy (Phillipe and Koehler, 2005) we found that promoting self-interests tends to be common and perceived to be acceptable. Our findings are supported by agency theory, which assumes that individual behavior is guided by self-interest (Eisenhardt, 1989). Conflict emerges when the agent interprets the situation differently than the principal and feels it is acceptable to protect their self-interest. Further, our finding is supported by ethical ambivalence theory (Jansen and Von Glinow, 1985). This theory assumes that individual ethical behavior is a function of self-interest. Deception, although unethical, appears to be acceptable if it is in your self-interest, e.g., Enron.

Kouzes and Posner (1993) cite a survey conducted for the American Society for Quality Control (ASQC) by the Gallup organization. Fifty-five percent of employees surveyed reported that their companies stated that it was extremely important to show their customers that they are committed to quality. However, only 36 percent of the respondents stated that the company followed through extremely well on this commitment. When asked if their companies considered making quality everyone's top priority as extremely important, 53 percent said yes, but only 35 percent said the company was actually doing very well in that arena (1993, p. 39). When the perception of leadership action falls behind leadership promises, the credibility gap widens. This gap induces perceptions of workplace deception.

Quality initiatives are likely to be successful when employees manage by facts. This principle is often cited by organizations. However, management by fact is not often achieved. It is more likely that employees in a deceptive workplace culture will ignore the facts or provide deceptive explanations to mitigate negative reactions. Bies, Shapiro, and Cummings (1998) and Shapiro, Buttner, and Barry (1994) found that lies, excuses, and stories are used to reduce conflict and mitigate the negative effects of a decision or action. When failures or harmful outcomes occur, those involved may lose a great deal of status, power, and credibility, and their competence may come into question (Sitkin & Bies, 1993; Sutton & Callahan, 1987). Deceptive explanations can serve to plausibly deny responsibility for the problem (Sitkin & Bies, 1993; Browning & Folger, 1993). However, such explanations only prevent success in implementing quality goals. The seduction of misleading excuses or conceiving stories to protect one's status can cause serious harm to the organization.

Most change management specialists advance that change initiatives must be led by top management, and that top management actions have to be consistent with their espoused views. We argue that even when top management is on board, that alone does not create a culture absent of deception in the workplace. Top management actions are pivotal, but they do not overcome a culture that fosters deception. For top management to espouse quality while ignoring fundamental assumptions made by employees is a mistake. The first step in initiating innovative performance improvement practices is to create a culture where employee deception is not a common organizational behavior; such a culture cultivates an environment of openness and honesty in which blame and retribution are absent. Organizations that remove workplace deception are more likely to be successful in creating cultures that achieve innovative performance improvement practices.

6. Summary and Future Research

This article hypothesizes that creating a culture that minimizes workplace deception is a requirement for successfully implementing innovative performance improvement practices. It provides reasoning and research to support our hypothesis. It suggests that companies in the
United States must meet the challenge of removing workplace deception from their culture if they are going to be successful in implementing performance improvement initiatives such as Total Quality Management, Six Sigma, and Lean.

Future research should focus on the development of a workplace deception measurement instrument. Once the instrument is validated, the proposed hypothesis could be tested by administering the instrument in companies that have been successful and unsuccessful in implementing innovative performance improvement practices. If the proposed theory is correct, successful companies should have cultures that are significantly lower in workplace deception.

Eisenhardt, K.M. (1989). Agency theory: An assessment and review. Academy of Management Review, 14, 57-74.

Erickson, C.L. and Jacoby, S.M. (2003). The Effects of Employee Networks on Workplace Innovation and Training. Industrial and Labor Relations Review, Vol. 56, No. 2.

Erickson, C.L. and Jacoby, S.M., with Copeland, R., Goldschmidt, and Tropher, J. (1998). "Training and Work Organization Practices of Private Employers in California." Policy Research Report, California Policy Seminar, Berkeley.

Grover, S.L. (1993). Lying, Deceit and Subterfuge: A model of dishonesty in the workplace. Organization Science, 4, 478-495.

Gittleman, M., Horrigan, M. & Joyce, M. (1998). "'Flexible' Workplace Practices: Evidence from a Nationally Representative Survey." Industrial and Labor Relations Review, Vol. 52, No. 1 (October), pp. 99-115.

Shapiro, D.L. (1991). The effects of explanations on negative reactions to deceit. Administrative Science Quarterly, 36, 614-630.
Abstract

The objective of the Joint Technical Data Integration (JTDI) project at Morgan State University is to demonstrate specific Department of Defense advances in aviation maintenance and aircraft availability in ways that would benefit the civil aviation community. This project will describe a research strategy to configure an information and system architecture to assist the civil aviation industry through a collaborative project of academia, industry, and government.

1. Introduction

The aviation industry utilizes vast amounts of paper aircraft maintenance documents. The conversion to a paperless environment provides maintenance data that is more organized and easily accessible. The Joint Technical Data Integration (JTDI) system is a three-tiered, web-enabled Delivery Management System that provides updated maintenance information to individuals in the field of aviation. The United States military utilizes JTDI to deliver Interactive Electronic Technical Manuals (IETMs) to its pilots. These electronic manuals are accessed on demand through Portable Electronic Display Devices (PEDDs) rather than by pulling maintenance information from stacks of paper manuals [1].

2. Background

The Joint Technical Data Integration (JTDI) system is a three-tiered, web-enabled Delivery Management System that provides updated maintenance information to individuals in the field of aviation. The JTDI system was developed by the Naval Air Systems Command in 1998 to alleviate the use of paper-based aviation maintenance processes. The military wanted a system to improve communication and data reliability among support groups. These groups consisted of weapon systems program offices and soldiers [1].

The key components of the JTDI system are:

• Deployable caching servers. These servers provide automatic technical manual data updates.

• Deployable test systems. These systems receive technical manual updates from a caching server and supply the manual information to the Portable Electronic Display Devices (PEDDs) at the worksite.

• Portable Electronic Display Devices (PEDDs), also known as Portable Electronic Maintenance Aids (PEMAs), which display the technical manual information to the maintainer at the worksite.

• TechCam capabilities. Real-time collaboration tools to communicate with technical representatives and subject matter experts.

The JTDI system design utilizes Commercial Off the Shelf (COTS) hardware and software products, which can be easily upgraded over time.
3. Objective

The research objective is to determine what roles the JTDI technology can play in civil aviation. JTDI can impact flight safety when integrated with civil aviation maintenance. Examples are:

• Increase the effectiveness of information through Dynamic Personalized Data Delivery. JTDI can provide the right data to the right user at the right time.

• Enable continuous access to aviation/aircraft/safety data by aircraft operators and maintainers.

Morgan State University has two JTDI systems. One system is located at the Morgan State University School of Engineering in Baltimore, MD. The second JTDI system is located at the Intergraph Corporation in Patuxent River, Maryland. These redundant systems are implemented in three tiers; the top, middle, and PEMA-level tiers synchronize to provide the maintenance data.

Figure 1. The architecture of the JTDI system

Figure 1 displays three levels: the Data Owners, JTDI (Top Tier and Mid Tier), and JTDI Users. The Data Owners provide the data for the system at the top tier. The Subject Matter Experts (SMEs) provide supplemental knowledge for the system data via video conferencing at the top tier. The top tier also provides a system help desk. The JTDI Users are the on-site employees who are available to provide the updated data to the war fighters [2].

The top tier capabilities are:

• Provides search capability or direct synchronization via PEMA or work station

• Same standard look and feel on the Mid-Tier/Top-Tier for each weapon system

• Weapon-system-oriented Web sites with standard look and feel for each weapon system
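The tiered synchronization described above (top tier to mid-tier caching server to PEMA) can be sketched as a simple pull-based update chain. The class, field, and manual names below are illustrative assumptions, not the actual JTDI interfaces:

```python
# Minimal sketch of tiered technical-manual synchronization:
# top tier -> mid-tier caching server -> PEMA at the worksite.
class Tier:
    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream
        self.manuals = {}          # manual id -> revision number

    def publish(self, manual_id, revision):
        """Top tier only: data owners post a new manual revision."""
        self.manuals[manual_id] = revision

    def synchronize(self):
        """Pull any newer manual revisions from the upstream tier."""
        if self.upstream is not None:
            self.upstream.synchronize()            # refresh upstream first
            for mid, rev in self.upstream.manuals.items():
                if self.manuals.get(mid, -1) < rev:
                    self.manuals[mid] = rev

top = Tier("top tier")
cache = Tier("mid-tier caching server", upstream=top)
pema = Tier("PEMA", upstream=cache)

top.publish("manual-123", 3)   # hypothetical manual id
pema.synchronize()             # PEMA now holds revision 3 via the cache
```

The pull-through-the-cache design mirrors why the deployable caching servers matter: a worksite device never contacts the top tier directly, so updates still propagate when only the local cache is reachable.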
The maintainer's equipment is another consideration. Currently, JTDI provides maintenance manuals on PEMA (laptop) devices. Although PEMAs provide a paperless method of obtaining information, the maintainer can experience mobility limitations with PEMAs when servicing parts of an aircraft, such as the wing area. Hands-free devices that supply data to maintainers, such as Head-Mounted Displays (HMDs), will be researched.

[2] Eddie Boyd, "The Investigation of Three-Tiered COTS Software Technology for Utilization of Data Acquisition and Abstraction of Aviation Data", 2007 CIBAC Summer Institute Research Reports, August 2007.

[3] S. Cummings, 2000: "Joint Technical Data Integration Knowledge as a Force Multiplier", The DISAM Journal, Summer 2000, pp. 57-59.

[4] Joint Technical Data Integration, https://jatdi.redstone.army.mil/
This is a dual concept. One part relates to the format, which has already been discussed under the previous dimension (attractiveness). The other is discussed in terms of: clarity of idea; clarity of questions (simple/complicated/sophisticated); clarity of answers; the announcer's probing for the guest's answers; and clarity of conclusion.

2.3. Diversity of content

This dimension tries to measure the extent to which the content of TV satellite channel

This is one of the most important dimensions due to the difficulty of its assessment because of its intangibility. This item can be measured in terms of: freedom from any interest or benefit on the part of the program's crew in covering one issue more than another; honesty in selecting and covering issues according to professional criteria; freedom from favoritism in selecting the figures/guest speakers who participate in the programs; and finally flexibility to announce facts, figures, evidence, proofs, etc. as they are.
Figure (1) Framework for Service Quality Improvement of TV Satellite Channel Programs:
Potential Viewers → Questionnaire Distribution → Dimensions Defining and Ranking → Data Collection → Dimensions Ranking → Statistical Data Analysis → Recommend improvement actions for each channel in its malfunctioning dimensions → Improvement Takes Place
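The scoring-and-ranking steps of this framework can be sketched in code. This is a hypothetical illustration only: the dimension names and questionnaire ratings below are invented, not the paper's survey data. Each dimension's ratings are averaged, the dimensions are ranked, and the weakest ones are flagged for improvement actions.

```python
# Hypothetical sketch of the ranking step in the framework of Figure (1).
# Dimension names and ratings below are invented examples, not survey data.
from statistics import mean

ratings = {
    "attractiveness": [4, 5, 3, 4, 4],
    "clarity":        [3, 3, 2, 4, 3],
    "diversity":      [5, 4, 4, 5, 4],
    "objectivity":    [2, 3, 2, 2, 3],
}

# Average each dimension's ratings, then rank from strongest to weakest.
scores = {dim: mean(vals) for dim, vals in ratings.items()}
ranked = sorted(scores, key=scores.get, reverse=True)

# Flag the weakest dimensions (below the midpoint of a 1-5 scale)
# as candidates for improvement actions.
needs_improvement = [d for d in ranked if scores[d] < 3.0]
print(ranked)
print(needs_improvement)
```

In a real study, the statistical-analysis step would also test whether the differences between channels are significant before recommending actions.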
5. References.

[1] A. Agus, S. Barker, and J. Kandampully, "An exploratory study of service quality in the Malaysian public service sector", International Journal of Quality and Reliability Management, Vol. 24, No. 2, 2007, pp. 177-190.

[2] P. Bedi and V. Gaur, "Multi dimension quality model of MAS", Proceedings of the 2006 International Conference on Software Engineering Research and Practice and Conference on Programming Languages and Compilers (SERP'06), Vol. 1, No. 1, 2006, pp. 130-135.

[3] D. A. Garvin, "What does product quality really mean?", Sloan Management Review, Vol. 26, No. 1, 1984, pp. 25-43.

[4] D. A. Garvin, "Competing on the eight dimensions of quality", Harvard Business Review, Vol. 65, No. 6, 1987, pp. 101-108.

[12] R. Sebastianelli, N. Tamimi, and M. Rajan, "Consumers' perceptions of the factors affecting e-quality", Decision Sciences Institute Proceedings, 2003, pp. 1855-1860.

[13] N. Srikatanyoo and J. Gnoth, "Quality dimensions in international tertiary education: a Thai prospective students' perspective", Quality Management Journal, Vol. 12, No. 1, 2005, pp. 30-40.

[14] C. R. Wright, "Functional analysis and mass communication", Public Opinion Quarterly, Vol. 24, 1960, pp. 610-613.

[15] N. Ye and J. Jia, "Customers' perceived service quality of internet retailing", Proceedings of the International Conference on Service Systems and Services Management, 2005, pp. 514-519.
that this vast "knowledge base" can be used to produce advice for a non-expert to solve a problem for which they have not gained appropriate experience, as if they were equal to the expert. This is quite a challenge. Students were having difficulty understanding the nature of these systems and how they differ from traditional information systems from a design standpoint. Figure 1 outlines the components of an expert system.

In support of this course, which was my most challenging to teach, I developed a model of an expert system to demonstrate to the students how such a system is developed using a language, CLIPS. The model involved picking two individuals a year who were experts in their field and who were living at the time, so that theoretically I could have interviewed them in person to capture their expertise. However, these chosen individuals were well known enough in their field to have had books, and later Internet articles, published about them. This was the source of information for these systems. I began presenting my work at various conferences for peer comments to improve the process. A list of organizations where these papers have been presented and published is shown as Figure 2.

I realized recently that I had reached a collection of experts, handled individually, that had been focused on two areas of expertise: management and the computer industry. This year I focused attention on adding new experts to those I had researched in the past, who for this paper are also billionaires but not from the computer industry. I will demonstrate how, through the use of an intelligent front-end, I could build a larger system as an example of extending the generality of my 2006 system for my MIS undergraduate students on an application of intelligence to expert system technology. This system is modeled after the 2006 IEMS paper titled "An Intelligent Expert System for Those Entering the Computer Industry".

Figure 3 denotes the changes required from the traditional single-expert system development components to a development that would add front-end intelligence and the ability to combine a group of experts into a single system. This would extend the value of this technology from providing information from a single expert to a knowledge-based system providing appropriate advice from an ever-growing number of experts.

FIGURE 1
COMPONENTS OF AN EXPERT SYSTEM
Dialog Screen – Determine nature of problem
Knowledge Base – A collection of "If...Then" statements from experts
Inference Engine – Software to search knowledge base
Presentation – Determine how to display advice to system user

FIGURE 2
CONFERENCES WHERE EXPERT SYSTEM PAPERS WERE PRESENTED
Association of Management
Decision Science Institute
Industry, Engineering, and Management Systems
Information Resource Management Association

FIGURE 3
COMPONENTS OF AN INTELLIGENT-BASED EXPERT SYSTEM
Dialog Screen – Determine the nature of advice being sought
Knowledge Base – Multiple experts in a focused field of expertise
Inference Engine – Modified to select only those experts appropriate
Presentation – Modified to prioritize
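The structure described in Figures 1 and 3 can be sketched as a small program. The paper's model was built in CLIPS; the code below is a hypothetical Python analogue (the experts, fields, and rule contents are invented placeholders) showing the intelligent front-end idea: the inference engine first narrows the knowledge base to experts in the field of the advice being sought, then fires the matching "If...Then" rules.

```python
# Hypothetical Python analogue of Figures 1 and 3 (the original model used CLIPS).
# The experts, fields, and rules below are invented placeholders.

RULES = [
    # Each rule: which expert it came from, the expert's field,
    # an "if" part (facts that must all hold), and a "then" part (advice).
    {"expert": "Expert A", "field": "management",
     "if": {"new venture", "small team"}, "then": "Delegate early."},
    {"expert": "Expert B", "field": "management",
     "if": {"new venture"}, "then": "Protect cash flow."},
    {"expert": "Expert C", "field": "computing",
     "if": {"new venture"}, "then": "Ship a prototype fast."},
]

def advise(facts, field):
    """Front-end selects experts by field; the engine fires matching rules."""
    selected = [r for r in RULES if r["field"] == field]   # intelligent front-end
    fired = [r for r in selected if r["if"] <= facts]      # "if" part satisfied
    # Presentation step: prioritize more specific rules (more conditions first).
    fired.sort(key=lambda r: len(r["if"]), reverse=True)
    return [(r["expert"], r["then"]) for r in fired]

print(advise({"new venture", "small team"}, "management"))
```

The field filter is what distinguishes the Figure 3 design from the single-expert system of Figure 1, where every rule in the knowledge base would be a candidate.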
CONCLUSION
REFERENCES:
1. Beerel, Annabel (1998): Expert Systems in Business: Real World Applications; Ellis Horwood Publishing; New York.
2. Chandler, John S. (1998): Developing Expert Systems for Business Applications; Merrill Publishing; Columbus, Ohio.
3. Durkin, John (2000): Expert System Design and Development; Maxwell Macmillan Publishing; New York.
4. Harmon, Paul (1999): Creating Expert Systems for Business/Industry; Wiley and Sons; New York.
Isabelina Nahmens
Louisiana State University
nahmens@lsu.edu
Abstract

Six Sigma is a well-known business strategy that has helped many organizations globally, across all industries, achieve huge business benefits through the implementation of successful projects. These projects are identified, planned, and executed by employees from different levels of the organization. To deliver those business benefits, employees first measure the baseline performance of the key process output, establish a performance target, and then seek to close the gap between the baseline and target by applying the systematic and structured Six Sigma methodology, DMAIC. Comparing process performance before and after completing a Six Sigma project is not a trivial task. This comparison depends on 1) the type of measures (discrete or continuous), and 2) the availability of data (complete, limited, or no data). Adopting the right comparison technique is imperative to assess the results of a Six Sigma improvement project. This paper discusses comparison techniques including t-tests, Bayes estimation, and Monte Carlo simulation. Then, guidelines to facilitate the comparison of process performance before and after, under different circumstances, are identified.

The adoption of the methodologies outlined in this paper by companies executing Six Sigma projects under those circumstances will enable them to compare their before and after metrics regardless of their type of measures and data availability.

Introduction

In today's rapidly changing markets, companies need to deliver their products and services to customers in increasingly faster times, and these products and services need to meet and exceed customer expectations. Applying Six Sigma methodologies has allowed companies to focus on their customers and deliver products and services with exceptionally higher quality, thus enabling bottom-line benefits in terms of dollars and cost savings.

Six Sigma is both a quality management philosophy and a problem-solving methodology that is project- and team-based. The Six Sigma approach was pioneered at Motorola and then witnessed its glorious application in General Electric (GE) and IBM (Breyfogle, 1999). It has applications in both service and manufacturing. The principles, concepts, and tools of Six Sigma provide a measurable goal, a framework for problem solving, and the foundations of a corporate culture designed to reduce costs by responding to the voice of the customer and "doing it right the first time." Six Sigma methodologies are flexible and can be used to address the various facets of the business. They can be applied to the financial, planning, service, and execution functions of the organization. The principle of Six Sigma is simple: control the variation in the process so that there are only 3.4 defects per million opportunities.

Six Sigma utilizes a scientific problem-solving approach to address real-world problems. Typically, a Six Sigma project is executed in five consecutive phases: Define, Measure, Analyze, Improve, and Control (Breyfogle, 1999; Rath & Strong, 2002), or DMAIC for
short. The Define phase starts by creating a charter that clearly states the project goals, scope and boundaries, project team, business case, and stakeholders affected by the project. It aims to set the rhythm for the project and can be thought of as a contract between the process owner and the project team. Several tools can be employed in the Define phase; these include SIPOC (Suppliers, Inputs, Process, Outputs, and Customers), the VOC (Voice of the Customer), the Kano model, and the critical-to-quality driver tree. The Define phase results include a charter, an overview of the process, and critical-to-quality information.

The second phase of a Six Sigma project is Measure. It aims to pinpoint where the problem lies in a clear and precise way. The tools commonly used in the Measure phase include the data collection plan, control charts, gage R&R (Repeatability and Reproducibility), Pareto charts, and FMEA (Failure Modes and Effects Analysis). The Measure phase establishes a baseline for the process performance in terms of calculating a process sigma or Defects per Unit (DPU). It results in collecting the data that feeds the Analyze phase of the project.

The Six Sigma process proceeds with the Analyze phase, which aims to identify the root cause(s) of the problem under consideration. The tools commonly used in the Analyze phase include root cause analysis and statistical tools such as regression analysis, hypothesis testing, scatter plots, control charts, and Pareto charts, among many others. The 5-why technique can be used to identify the root cause of the problem. The results of the Analyze phase are the root causes of a particular problem, together with a set of potential solutions.

The fourth phase of a Six Sigma project starts upon identifying the root causes of a particular problem. The Improve phase deals with developing, implementing, and verifying solutions. The tools commonly used in this phase include brainstorming, creative problem-solving techniques such as the Delphi method, and planning tools. To assess the degree of improvement, a new process sigma or DPU value is calculated for the improved process. The result of this phase is an implemented solution for the problem.

The last phase of a Six Sigma application is the Control phase. It deals with making sure that the improvement stays in place, and that the gains achieved by applying Six Sigma are sustained. The tools used in the Control phase include control charts, flow diagrams, and tools to compare before and after, such as run charts and frequency plots. Many companies add a fixed-period realization phase, where the improved process is monitored to ensure that the gains are sustained.

A key feature of a successful Six Sigma culture is the creation of an infrastructure that supports and invests in process performance improvement (Brue 2002). Six Sigma projects involve a major investment of resources and must deliver bottom-line results. However, determining statistically the success of process improvement after the completion of a Six Sigma project might not be feasible; it depends on the availability of data. Furthermore, it is not a common practice in cases where the data after implementation is limited. Usually, Six Sigma teams realize only the short-term paybacks and most obvious results of the improvement. A major difference between Six Sigma and failed programs of the past is the emphasis on tangible, measurable results (Pyzdek 2003). Six Sigma advocates argue that projects are selected to provide a mixture of short- and long-term results that justify the investment and the effort. It is vital that information regarding results be documented and statistically compared with the pre-improvement process baseline, both for the project to be a success and to properly design an effective control plan.

Selection of Comparison Technique

Comparing process performance before and after completing a Six Sigma project is not a trivial task. This comparison depends on 1) the type of measures (discrete or continuous), and 2) the availability of data (complete, limited, or no data). Adopting the right comparison technique is imperative to assess the results of a Six Sigma improvement project. This section explores the
use of comparison techniques such as t-tests, Bayes estimation, and Monte Carlo simulation to assess the process performance of a completed Six Sigma project. Figure 1 exhibits a flow chart of the selection of a comparison technique. The decision about which comparison test to use for a particular analysis is of vital importance to making unbiased and correct decisions about improvement results.

Statistical Analysis

Pyzdek, a noted quality consultant, states that more than 400 tools are now available for Six Sigma (Pyzdek 2003). However, most firms rarely go beyond the basic improvement tools and fail to recognize the benefits of more sophisticated statistical tools (Evans and Lindsay 2008). The DMAIC methodology was designed to be used with advanced statistical methods. Hence, statistical analysis is essential in implementing a continuous improvement philosophy.

The Six Sigma DMAIC methodology uses statistical analysis to interpret statistical information in order to understand and predict process performance. A roadmap for selecting one of the available statistical testing procedures is illustrated in Figure 2. The selection of a statistical comparison is dependent on the type of data, whether continuous or discrete. A common form of statistically comparing the process performance of a specific continuous metric before and after the improvements of a completed Six Sigma project is a t-test. A t-test is a statistical tool used to determine whether a significant difference exists between the means of two distributions, or between the mean of one distribution and a target value (Evans and Lindsay 2008). The two most widely used statistical techniques for comparing two (normally distributed) groups are the independent-group t-test and the paired t-test. The independent-group t-test is used to compare means between two groups where there are different subjects in each group, whereas the paired t-test compares subjects for the two groups by matched pairs (e.g. the same subjects are observed twice, often with some intervention taking place between measures). For example, the researcher may observe the defect levels of the output of a process before and after improvements are implemented. Then the mean difference between the two repeated observations (e.g. before and after) is documented and compared. If the difference is significantly great, then there is evidence that the process improvement implemented caused some change in the observed metric (e.g. defect level).

One of the benefits of t-tests is that they are usually easy to understand and generally easy to perform. However, the fact that these tests are so widely used does not make them the correct analysis for all comparisons. T-tests are used to perform comparisons on continuous variables and assume normally distributed data. When dealing with non-continuous data, other statistical tests, such as chi-squared analysis and proportions tests, can be applied. In cases where normally distributed data cannot be assumed (the Anderson-Darling test of normality can be used to check the data distribution), non-parametric tests should be used.

Bayesian Estimation

Traditional statistical techniques such as t-tests and ANOVA comparisons are applied when sufficient data exist for the process after improvement. Nevertheless, in many instances there exist only a few (1-5) data points for the new process, especially when the process performance measure under consideration is cycle time, which makes the application of traditional statistics a challenge. To conduct such a comparison, an approach adapted from reliability engineering is used to demonstrate Six Sigma project improvements.

Weibull and Weibayes analyses have been used extensively in reliability engineering (Abernethy, 2000). Traditional reliability theory deals with product/part lifetimes. Reliability is the probability that a product will perform its intended function for a specified period of time, under stated operating conditions. Weibull analysis is based on using the Weibull distribution to model the lifetime of a product or a part. The Weibull distribution has two parameters: shape and scale. The shape
parameter Beta designates whether the Weibull hazard rate is decreasing (Beta < 1), constant (Beta = 1), or increasing (Beta > 1). The scale parameter Eta, also known as the characteristic life, represents the time or age at which 63.2% of the parts have failed. If Beta is equal to 1, the Weibull distribution is identical to the exponential distribution with mean Eta. In addition, a shift parameter can be applied to the Weibull distribution to take into consideration the parts' minimum life. Weibayes analysis is a Weibull type of analysis that assumes a known Beta value; this assumption can improve the estimation of Eta when few failure data have been obtained for a new product design.

Weibull analysis can be extended to model different types of times, which are not necessarily product/part failure related; for example, modeling cycle time in situations where few data points are available after process improvement. Weibayes analysis can be used to estimate a scale parameter and confidence interval for the new data. The approach starts by using Weibull analysis to obtain the shape and scale parameters for the original data. After that, Weibayes analysis is used to estimate the new scale parameter by assuming the same original shape, or Beta (thus requiring fewer data points for comparison). The Weibayes analysis will result in drawing parallel lines (same slope, or Beta) on the Weibull probability plot. A key assumption in applying Weibayes analysis is that the resulting improvements did not change the cycle time distribution's shape parameter (Beta) to a significant degree. The confidence intervals for the new and original data can then be used to assess whether a statistical difference exists or not.

Case Study: Statistical Analysis vs. Bayesian Estimation

A Six Sigma project in a manufacturing organization was conducted to reduce the cycle time of the proposal development process. Before improvement, the original cycle times were 15, 18, 19, 20, 20, 21, 21, 21, 21, 22, 22, 25, 26, 28, and 29. For this case study, two cases will be used:

Case 1: Complete data is available after the process improvement project: 13, 14, 15, 15, 16, 18, 18, 20, 20, and 21.

In this case, since sufficient data exists for the application of traditional statistical analysis techniques, a t-test comparison is used. The listing below indicates that the new process has a statistically significantly lower cycle time, as indicated by the p-value of 0.001. (Running the Anderson-Darling test of normality shows that both the original and the new data can be assumed to be normally distributed, p-value > 0.1.)

Case 2: Only four new cycle times were obtained: 13, 14, 18, and 20.

In this case, only a limited set of cycle time data is available after process improvement; therefore, Weibayes analysis will be used. First, a Weibull distribution is fitted to the original data. The resulting distribution has a shape parameter of 7.03 and a scale parameter of 23.28 (the p-value from the goodness-of-fit test is greater than 0.1). After that, and assuming the shape parameter stays the same, the new scale parameter is 17.57 (the p-value from the goodness-of-fit test is greater than 0.5). To conduct the statistical comparison using the confidence-interval approach, a Weibull probability plot such as that shown in Figure 4 is used. Since the confidence intervals do not overlap, this indicates that the new cycle time shows a reduction from the original cycle time data.

Monte Carlo Simulation

Typically, Monte Carlo simulations are used in the Analyze or the Improve phase of a Six Sigma project. For instance, this approach can be used to improve the capability of processes and to quantify the expected variation of the improvement. However, simulations are not only very powerful when utilized in the earlier phases of a Six Sigma project; they are also a powerful tool in statistical process control (Jordan 2007, Perry 2006). Monte Carlo is a powerful tool because it allows the use of statistical models to account for realistic variation and to display the range and shape of the process output. In addition, it can help detect hidden factors that are not
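Case 1 can be reproduced with a pooled two-sample t-test. In practice one would use a statistics package (e.g. scipy.stats.ttest_ind); the sketch below uses only the Python standard library, with the two-sided alpha = 0.05 critical value for df = 23 hard-coded from a t-table, and the cycle-time data taken from the case study.

```python
# Pooled two-sample t-test for Case 1, standard library only.
# Data are the case-study cycle times; 2.069 is the two-sided
# alpha = 0.05 critical value for df = 23, taken from a t-table.
from math import sqrt
from statistics import mean, variance

original = [15, 18, 19, 20, 20, 21, 21, 21, 21, 22, 22, 25, 26, 28, 29]
new = [13, 14, 15, 15, 16, 18, 18, 20, 20, 21]

n1, n2 = len(original), len(new)
# Pooled variance assumes both groups share a common variance.
sp2 = ((n1 - 1) * variance(original) + (n2 - 1) * variance(new)) / (n1 + n2 - 2)
t = (mean(original) - mean(new)) / sqrt(sp2 * (1 / n1 + 1 / n2))

T_CRIT = 2.069  # two-sided alpha = 0.05, df = n1 + n2 - 2 = 23
significant = abs(t) > T_CRIT
print(round(t, 2), significant)  # t ≈ 3.51: significant, consistent with p = 0.001
```

As the paper notes, this conclusion is only valid after the normality assumption has been checked (e.g. with an Anderson-Darling test), which requires a statistics package.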
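Case 2's Weibayes point estimate can be checked directly. With the shape parameter Beta held at the original fit's value of 7.03 and all four new cycle times treated as complete observations, the standard Weibayes estimator eta = (sum(t_i^Beta) / r)^(1/Beta) reproduces the paper's new scale parameter of roughly 17.57. (Fitting Beta itself, and computing the confidence bounds for the probability plot, would need a Weibull analysis package.)

```python
# Weibayes scale estimate for Case 2: Beta is held at the value fitted
# to the original data (7.03 per the case study); r is the number of
# observed values, since all four new cycle times are complete.
beta = 7.03                   # shape parameter from the original Weibull fit
new_times = [13, 14, 18, 20]  # the four post-improvement cycle times
r = len(new_times)

eta = (sum(t ** beta for t in new_times) / r) ** (1 / beta)
print(round(eta, 2))  # ≈ 17.57, matching the paper's new scale parameter
```

Because Beta is shared between the original and new fits, the two fitted lines on the Weibull probability plot are parallel, which is exactly the "same slope" behavior the paper describes.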
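The Monte Carlo approach can be sketched as follows. The three process steps and their triangular parameters below are invented for illustration (the paper does not list them): each step's duration is drawn from a triangular distribution elicited from expert input, and the simulated total cycle time is then summarized.

```python
# Hypothetical Monte Carlo sketch: the steps and their (low, mode, high)
# triangular parameters are invented examples, not the paper's data.
import random

random.seed(1)  # reproducible run

# (low, mode, high) in days, one triple per step of the improved process,
# as might be elicited from expert input.
steps = [(4, 5, 8), (5, 6, 9), (3, 4, 7)]

N = 10_000
totals = []
for _ in range(N):
    # Note: random.triangular takes its arguments as (low, high, mode).
    total = sum(random.triangular(low, high, mode) for low, mode, high in steps)
    totals.append(total)

totals.sort()
mean_total = sum(totals) / N
ci_90 = (totals[int(0.05 * N)], totals[int(0.95 * N)])  # empirical 90% range
print(round(mean_total, 1), [round(x, 1) for x in ci_90])
```

The resulting distribution of total cycle time can then be compared against the pre-improvement baseline to judge whether the project delivered a reduction.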
sigma practitioner can use Monte Carlo simulation, using triangular distributions fitted based on expert inputs for each step of the improved process, to assess whether the Six Sigma project resulted in cycle time reductions.

Conclusions

This paper has focused on providing guidance for Six Sigma practitioners when comparing process performance before and after a Six Sigma project. Adopting the right comparison technique is imperative. Based on the availability of post-improvement data, either traditional statistical analysis, Bayesian estimation, or Monte Carlo simulation is used. The type of data (discrete or continuous) is a key factor in selecting the appropriate statistical analysis procedure. Six Sigma practitioners should always check the assumptions associated with the procedures they apply, and they should validate the models prior to their use. The paper concludes with a set of guidelines, derived from literature and practice, for comparing process performance.

References

Law, A., and Kelton, W.D., 2000, Simulation Modeling and Analysis, 3rd Edition, McGraw-Hill.

Perry, R., 2006, Monte Carlo Simulation in Design for Six Sigma, Proceedings of the 2006 Crystal Ball User Conference.

Rath & Strong, 2002, Six Sigma Pocket Guide, Rath & Strong, a subdivision of AON Management Consultants.
[Figure 6. Stress contour for 20 ft vertical fall (anterior and posterior views)]
[Figure 8. Stress contour for 40 ft vertical fall (anterior and posterior views)]
Table 3. Ligament tears due to impact

FH   MD     LA                       JA                            TT
10   3.36   ATF, CF, LCL, MCL        (1), (2)                      A
20   4.6    ATF, CF, LCL, MCL,       (1), (2), (3), (4), (5)       B
            IFL, ILL, FL, PLL, ALL
25   11     ATF, CF, LCL, MCL,       (1), (2), (3), (4), (5), (6)  C
            IFL, ILL, FL, PLL, ALL
40   14     ATF, CF, LCL, MCL,       (1), (2), (3), (4), (5),      D
            IFL, ILL, FL, PLL, ALL   (6), (7)

Where:
FH = Fall Height in ft
MD = Maximum Displacement in mm
LA = Ligaments Affected
JA = Joints Affected
TT = Type of Tear
LCL = Lateral Collateral Ligament
MCL = Medial Collateral Ligament
IFL = Iliofemoral Ligament
ILL = Iliolumbar Ligament
FL = Ligamentum Flavum
PLL = Posterior Longitudinal Ligament
ALL = Anterior Longitudinal Ligament
Joints: Ankle (1), Knee (2), Hip (3), Lumbar (4), Cervical (5), Shoulder (6), Thoracic (7)
A = Subluxation (1 and 2)
B = Subluxation (1, 2, 3, 4, and 5)
C = Complete (1, 2, 3, 4, 5, and 6)
D = Complete (1, 2, 3, 4, 5, 6, and 7)

Figure 13 evaluates the relationship between the stress and strain of bone during the vertical fall from a height of 40 ft. It shows the similarity of the stress-strain curves of the bones in a joint undergoing a compressive force. It also reveals the fractures of the ankle bones when the yield stress is close to 120 MPa.

[Figure 13. Stress vs. Strain at 40 ft Vertical Landing: stress (0-140 MPa) plotted against strain (0-0.8 mm/mm)]

5. Conclusion and Future Studies

Results revealed that a comminuted-type fracture occurs at a vertical fall of 20 ft or higher, because at these heights the joints were subjected to high impact force or stress. As the height was increased in the simulations, the stress-strain plot reached the yield strength value of cortical bone. It starts showing failure when the strain is greater than 0.7 mm/mm. Greater heights also resulted in partial to complete tears of the ligaments surrounding the cervical, thoracic, lumbar, shoulder, elbow, hip, knee, and ankle joints, causing severe joint fractures and requiring immediate medical attention.

References

[1] Gray, H., Gray's Anatomy, New York: Elsevier, 2005.

[2] Netter, F., Atlas of Human Anatomy, Teterboro, New Jersey: ICON Learning Systems, 2004.

[3] My Ankle Replacement.com, DePuy, 2007. <http://www.jointreplacement.com/DePuy/index.html>.

[4] Hawley, N., Free 3D Models, Free CAD Models, SolidWorks-Dassault Systems, 2007. <http://3dcontentcentral.com/3DContentCentral/Search.aspx?arg=manikin>.

[5] Sawbones Third-Generation Simulated Cortical Bone, MatWeb: Material Property Data, 11 Oct.-Nov. 2007.

[6] The Spinal Cord and Injury, The Cleveland Clinic Department of Patient Education and Health Information, 24 Nov. 2007.
Finally, automated infrastructure support provides all the processes that facilitate all the other processes, such as computers, data transfer and retrieval systems, and the like. A very basic and important ingredient of CE is Design for Manufacture (DFM). DFM is the practice of designing products with manufacturing in mind so that they can be designed in the least time and at the least cost. DFM also allows a smoother transition from the design of a product into its production, as well as minimizing the cost of assembling and testing the product. Quality and reliability are also positively affected by DFM; therefore, the needs and satisfaction of the customers are met and the product automatically becomes more competitive in the market [7-9].

surveys, for example. Once the problem has been identified, iteration takes place, where different ideas are exchanged and methods tried in order to completely define and design the product. This part of the DFM design process is very important, since at this stage major decisions will be made regarding the production and marketing of the product. This iteration period can vary in length, and the length is usually subject to a deadline in order to meet an optimal time-to-market for the product. In addition, the iteration period allows the design to be continuously improved and optimized over time, as better and more complete design information becomes available and all the teams involved in the design process communicate their discoveries and ideas. Comprehensive planning, research, and development reduces the amount of iteration and makes any engineering change possible at a reduced cost in the event that a product is being revised and redesigned. As a result, the quality, cost, and delivery of the product are greatly improved thanks to early design decisions and increased communication among all members involved in the design, production, and marketing of the product [11, 12]. A typical product design cycle for manufacture is shown in Fig. 2.

[Figure 2. A typical product design cycle for manufacture: Concept Design → Physical Design → Prototype Test]

This paper provides an overview of concurrent engineering design concepts and quality tools. These tools are applied for product and process improvement in a small electronics manufacturing company located in the State of Rio Grande do Sul, Brazil, to reduce the rate at which defective parts are produced, increase the quality of manufactured products, increase productivity, minimize product cost, supply products on time to customers, provide customer satisfaction, and keep the company competitive in the marketplace.

2. CE/DFM Tools
and that the use of those principles will guide and evaluate design decisions that will ultimately lead to a good design. The axioms must be applicable to the full range of design decisions and to all stages, phases, and levels of the design process. Axiom 1: in good design, the independence of functional requirements is maintained. Axiom 2: among the designs that satisfy Axiom 1, the best design is the one that has the minimum information content. In order to apply both axioms, the team must first identify the functional requirements and constraints to be used when designing the product. Once those requirements and constraints have been found, the axioms should be applied to each individual design decision to ensure the success of the project under the DFM guidelines [8, 13]. A condensed listing of the DFM guidelines is found in [13, 14].

2.2. Quality tools: ISO-9000 standards

Quality is a competitive advantage in the marketplace. This means that a company must have the same or better quality than its competitors. The quality of competitors, however, is improving continuously. This situation keeps a company improving quality continuously to remain a viable force in the marketplace. Companies can no longer tolerate adverse gaps in product quality between their products and those of their competitors. To prevent this, a viable program of continuous improvement must be developed and put into practice [15].

ISO-9000 standards exist principally to facilitate international trade. The driving forces that have resulted in the widespread implementation of ISO-9000 standards can be summed up in one phrase: "the globalization of business". Expressions such as "post-industrial economy" and "the global village" reflect profound changes during recent decades. These changes have led to increased economic competition, increased customer expectations for quality, and increased demands upon organizations to meet more stringent requirements for the quality of their products [16].

The Vision 2000 report proposed four goals that relate to maintaining the ISO-9000 standards so that they continually meet the needs of the marketplace. These goals are universal acceptance, being adopted and used worldwide; current compatibility, combined use without conflicting requirements; forward compatibility, with successive revisions being accepted by users; and forward flexibility, using an architecture that allows new features to be incorporated readily [16, 17].

ISO-9000 certification occurs when a neutral and independent "registrar" uses one of the ISO-9000 standards to certify a supplier. This results in an official designation of the supplier as an "ISO-9001 or ISO-9002 or ISO-9003" certified supplier. A supplier registered in this manner is in the enviable position of being seen as a reliable worldwide supplier of quality products and services; its customers can reduce or eliminate inspection of purchased parts, thereby resulting in an efficient system of global trade [18].

The ISO quality standards do not refer directly to the products or services delivered, but rather to the production and administrative processes that produce them. They are generic enough to be applicable to any industry type. Specifically, the standards focus on the need for organizational structure, well-documented procedures, and management's commitment of resources to implement quality management [15]. The 20 clauses of the ISO-9001 standard, the most comprehensive of the three standards, are found in [16, 18].

2.3. Quality function deployment

Measurement and evaluation of product quality is not difficult, since one can establish conformance standards, inspect and test products, identify defect rates, correct errors, and impose performance levels. Quality function deployment (QFD) permits the assembly of a considerable amount of information in a concise form, in a small number of documents: QFD diagrams. The graphical format is more effective in simplifying complex information [19, 20]. QFD aims at the customer. It is necessary to hear the "voice of the customer" during the development of the products and processes, identify and focus on the details
and decide what is important. This tool simplifies a set of information through the use of graphs based on matrices. QFD is a system that translates the requirements of the customer, identifies necessary modifications before major expenses take place, reduces risks during the development phase, creates basic knowledge of the product, and introduces the "voice of the customer" into the development process [6, 20]. This method permits a large quantity of information to be collected in the form of graphs. Each customer differs in his/her expectations and necessities and, consequently, in the requirements. QFD identifies customers in the various stages of the development life cycle and employs techniques to compile their requirements. QFD demands a list of "WHATs" and "HOWs", where the "WHATs" are the necessities of the customers and the "HOWs" are the measurable requisites that can affect the understanding of the requisites of the customers. Quality function deployment - specifically, the house of quality - is an effective management tool in which customer expectations are used to drive the process under consideration [19, 20]. Some of the advantages and benefits of implementing QFD are: an orderly way of obtaining information and presenting it; a shorter development cycle; considerable reduction in costs; fewer process changes; reduced chance of oversights during process execution; an environment of teamwork; consensus decisions; and preserving everything in writing. QFD implementation results in a satisfied customer [13, 19].

3. Electronic Manufacturing

A study has been conducted for implementing some of the CE/DFM tools in a small-size electronic manufacturing industry located in the southern part of Brazil. The main products of this industry are sirens and alarms for cars. These products are distributed to car manufacturers and retail auto shops located in several states. The initial study with the management, design and manufacturing teams indicated the following: 1. Types of problems with respect to the product: defective plastic parts from the supplier, visual external finish, defective raw material, defective circuit board, and improper use of the product. 2. Factors affecting the product cost: transportation, quality program, rework, idle time in the assembly line, marketing, taxes, and product warranty. 3. Problems related to the production process: burnt resistances and motors, obsolete machines and equipment, lack of preventive maintenance, and insufficient tools. 4. Complaints from the consumer: delay in delivery, high cost of the product, better product from the competitors, and defective products.

4. Results and Discussions

Over a period of three months, 625 units installed and sold through two car manufacturers have been monitored. Out of these 625 units installed, 248 units (39.7% of total units sold) were returned from the customers to the manufacturer for repair and rework. The units were inspected before they were sent for repair and/or rework. Table 1 shows the inspection results on the 248 returned units. It is observed from Table 1 and Fig. 3 that burnt resistance/motor is responsible for 41% of repair and rework. About 44% of repair/rework is due to defective plastic parts and raw materials.

Table 1. Inspection results

Repair/rework                  No. of units   Percentage
1. Burnt resistance/motor      102            41.129
2. Defective plastic part      67             27.016
3. Defective raw material      42             16.935
4. Improper use                21             8.468
5. Visual external finish      16             6.452
Total                          248            100.00

The customer complaints received along with the returned 248 units were analyzed, and 344 responses were registered as shown in Table 2. It is clear from Table 2 and Fig. 4 that defective units and low quality (68%) are the major causes for return of the products from the customers for repair and rework.

Based on the preliminary study results, the following actions were taken to minimize the
number of defective units reaching the customers, increase product quality, minimize the unit cost of the product, and deliver the products on time to the customers.

Figure 3. Inspection results (bar chart of the number of units and percentage for each type of defect)

Table 2. Customer complaints

Customer complaints      No. of complaints   Percentage
1. Defective units       154                 44.767
2. Low quality           82                  23.837
3. High cost             67                  19.477
4. Delay in delivery     41                  11.919
Total                    344                 100.00

Figure 4. Customer complaints (bar chart of the number of complaints and percentage for each complaint type)

[…] existing products. The assembly line has been redesigned, incorporating new tools and equipment. An intensive training program has been arranged for everyone on the shop floor covering the new tools and equipment. The existing raw materials and plastic parts suppliers have been changed to ISO-9000 certified suppliers. The above changes significantly reduced repair and rework and increased productivity.

Product Quality: Final product testing procedures have been modified. Everyone in the enterprise, from the shop floor employees to top management personnel, has been trained in total quality management concepts. Product quality is improved […] market. This was made possible due to reduced repair and rework, and increased productivity.

Delivery Time: Product delivery time is significantly reduced due to the incorporation of the DFM guidelines and the increase in productivity.

Customer Satisfaction: Quality function deployment (QFD) concepts have been incorporated to meet the customer needs and requirements. This also provided continuous feedback from the customers.

Before the incorporation of the concurrent engineering concepts, almost 40% of the items manufactured and sold were sent back by the customers for repair and rework within three months of purchase. This has been significantly reduced after the incorporation of quality standards, design for assembly, and QFD concepts. The customer complaints have been significantly reduced, and the repair and rework dropped to less than
Proceedings of the 2008 IEMS Conference
be highly effective when applied in a small electronic manufacturing industry. Any industry that is not currently applying CE/DFM in its design and production phases is urged to do so. The benefits that can be obtained by correctly applying and implementing CE/DFM, QFD, and ISO-9000 are great when compared to other methods of design and production. From this study, it is very clear that only by the skillful integration of design, manufacturing, and marketing can an industry today survive in this highly competitive world, where cost, time-to-market, and good quality are imperative for survival.

Acknowledgment

This project was funded by FAPERGS (Process No.: 01/1218.7), Rio Grande do Sul, Brazil. The authors wish to thank FAPERGS for providing equipment, scholarship and travel funds.

References

[8] Helander, M., and M. Nagamachi, Design for Manufacturability: A Systems Approach to Concurrent Engineering and Ergonomics, Taylor & Francis, Washington, DC, 1992.

[9] Tanner, J. P., "Product Manufacturability", Journal of Automation, 1989.

[10] Stoll, H. W., "Design for Manufacture", Journal of Manufacturing Engineering, 1998.

[11] Boothroyd, G., Product Design for Manufacture and Assembly, Marcel Dekker, New York, NY, 1994.

[12] Norman, R., Concurrent Product and Process Development (CP/PD) - A Current Design Methodology: Making it Happen, International TechneGroup Incorporated, Revised Edition, 1990.

[13] Shina, S. G., Concurrent Engineering and Design for Manufacture of Electronics Products, Van Nostrand Reinhold, New York, NY, 1991.
E_H = PE + translational KE + rotational KE

E_H = mgh + (1/2)mv² + (1/2)Iω²    (1)

E_M = F·d    (2)
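A quick numerical check of Eqs. (1) and (2) can be sketched as follows; every input value below is assumed for illustration and is not taken from the paper:

```python
def heel_impact_energy(m, g, h, v, I, omega):
    """Total energy at heel strike, per Eq. (1):
    potential + translational kinetic + rotational kinetic energy."""
    return m * g * h + 0.5 * m * v**2 + 0.5 * I * omega**2

def midsole_work(F, d):
    """Energy absorbed by the midsole, per Eq. (2): work = force * distance."""
    return F * d

# Assumed example values, not taken from the paper:
m = 80.0        # body mass, kg
g = 9.81        # gravitational acceleration, m/s^2
h = 1.0         # fall height, m
v = 4.4         # impact velocity, m/s (about sqrt(2*g*h) for a 1 m fall)
I, omega = 0.0, 0.0   # rotation neglected in this simple estimate

E_H = heel_impact_energy(m, g, h, v, I, omega)   # about 1559.2 J
E_M = midsole_work(20e3, 0.025)                  # 500.0 J for 20 kN over 25 mm
print(E_H, E_M)
```

Comparing E_M against E_H for a range of midsole thicknesses d is one way such an optimal-thickness band could be bracketed.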
Figure 1. Calcaneal fracture (B: right foot, calcaneal axial view X-ray)
A. Transverse view
Figure 4. Stress-strain curve of the bone
5. Conclusions

It was the objective of this study to determine the optimal thickness of the midsole for this patient to prevent or limit the damage to the heel incurred from a fall. The optimal range was determined to be between 20 mm and 30 mm thick. Therefore, a patient of his size would benefit from these results. By adding the necessary amount of material, this would prevent the long-term
historical facts, spreadsheets were created to help organize the remaining information deemed necessary to accomplish this initiative. First, twelve categories were designed to thoroughly evaluate each program while maintaining emphasis on IE-specific courses. Eight of the categories are dedicated solely to industrial engineering to allow for a thorough analysis of programs in terms of the type of IE courses being taught and the emphasis each program may be placing on particular IE fields (see Table 1). Among those, the senior design course present in most undergraduate IE curricula was considered independently because it usually requires students to utilize an array of IE skills to address real-world problems.

Table 1. Dedicated IE Categories

Human Factors Engineering and Ergonomics
Manufacturing (Systems) Engineering
Operations Research & Systems Engineering
Production & Management Systems Engineering
Quality Management
Engineering Management
Simulation & Modeling
Senior Design

Electives were then divided into three categories to separate IE electives from Engineering & Science electives and other non-technical electives. The remaining category, General Education, was designed to account for the mandatory, yet non-IE, courses in the curriculum.

3. Data Gathering

[…] category courses were to be placed, but after attempting to categorize several schools, exceptions and conflicts arose requiring additional consideration: 1. Schools operating on a track basis, requiring students to select a specialization track early in the program, required the creation of a separate curriculum for each track; 2. Institutions operating on a quarterly system were denoted because these programs tend to require a higher number of credit hours, which will potentially skew the results of some analyses; 3. Schools were distinguished as public or private to account for possible differences due to accreditation requirements, and political, departmental, and organizational structure.

4. Database Design

Following the structured compilation of the necessary information, a relational database was created incorporating all findings. Using Microsoft Access 2003, a common field (schoolID) was implemented as the link among all tables, queries, forms, and reports. This field is automatically generated for each school added to the main informational table. Other relationships are depicted in Figure 1 below.
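The common-key design can be sketched with SQLite standing in for Access; every table and column name below other than the shared schoolID key is an illustrative assumption, not the paper's actual schema:

```python
import sqlite3

# In-memory stand-in for the Access database; table/column names other
# than the shared schoolID key are assumed for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE school (
    schoolID INTEGER PRIMARY KEY AUTOINCREMENT,  -- auto-generated per school
    name     TEXT NOT NULL,
    type     TEXT CHECK (type IN ('public', 'private'))
);
CREATE TABLE course (
    courseID INTEGER PRIMARY KEY AUTOINCREMENT,
    schoolID INTEGER NOT NULL REFERENCES school(schoolID),
    title    TEXT NOT NULL,
    category TEXT  -- one of the twelve curriculum categories
);
""")

con.execute("INSERT INTO school (name, type) VALUES ('Example State', 'public')")
con.execute(
    "INSERT INTO course (schoolID, title, category) "
    "VALUES (1, 'Senior Design', 'Senior Design')"
)

# schoolID is the common field joining the tables:
row = con.execute("""
    SELECT s.name, c.title FROM school s
    JOIN course c ON c.schoolID = s.schoolID
""").fetchone()
print(row)  # ('Example State', 'Senior Design')
```

The same join pattern extends to the queries, forms, and reports mentioned above, since all of them hang off the auto-generated schoolID.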
• Course Category Analysis permits the input of partial course titles in order to return a list of courses taught at each university related to that title. There are many factors available to limit the output, and those are user-defined.

• Course Analysis allows for analysis based on a minimal number of letters of the course name to account for variation among course titles. The type of university can be selected, and other constraints can be set (i.e., whether a user desires institutions that teach or do not teach a particular course).

• Topic Analysis allows for the comparison of topics, whether emerging or not, based on the input of any letters describing the desired IE concept.
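The partial-title matching these analyses rely on can be sketched as a case-insensitive substring filter; the sample catalog is an assumed illustration, not data from the actual database:

```python
def match_courses(courses, partial):
    """Return course titles containing the partial title, ignoring case,
    to tolerate variation among course names across universities."""
    needle = partial.lower()
    return [title for title in courses if needle in title.lower()]

# Assumed sample titles, not data from the actual database:
catalog = [
    "Engineering Economy",
    "Economic Analysis for Engineers",
    "Human Factors Engineering",
    "Simulation & Modeling",
]

print(match_courses(catalog, "econom"))
# ['Engineering Economy', 'Economic Analysis for Engineers']
```

Matching on a few letters rather than exact titles is what lets one query catch "Engineering Economy" and "Economic Analysis for Engineers" as the same topic.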
6. Summary
This paper demonstrates how this initiative was approached, how the research methods were executed, how the category-based database was created, and the means by which the database can be updated or edited. After the anticipated release of the database via the World Wide Web, institutions will be able to continuously update the database's content to ensure the accuracy and availability of their school's data, and will be permitted to freely use its contents to perform a wide range of comparative analyses suitable for a number of purposes.
7. Acknowledgements
Dana Johnston
Department of Children and Families, State of Florida
students was clearly addressed by the course instructor and supported by the DCF project champion.

The summary of team findings showed that the combined team was a strength; however, there were some noted areas for improvement. The strength of the combined approach was that the UCF students had access to DCF and Six Sigma Black Belt experts. Another strength was the diverse set of skills brought to the table by the UCF students.

While there were several strengths as previously noted, there were some key areas for improvement. The three most critical areas were the following: 1) understanding (willingness to learn) of the quality improvement methodology, 2) developing and executing a clear set of communication expectations and requirements, and 3) balancing the project with other work and class requirements. Most of these areas for improvement centered on the student experiences, but were also felt to a lesser degree by the DCF team members.

One area of special note was the management of the team when expectations were not met. For a three-week period, many team members missed several deadlines for deliverables. As a result, the team leadership revisited the team charter and clarified the expectations and the documentation that would result if expectations were not met without the proper communication. After this clarification meeting, the majority of the team performed very well, with some minor exceptions. Unfortunately, during the course of the study, a team member was reassigned to a different study for not meeting expectations. While this was an undesirable event, it was a tremendous lesson to the students that this was more than a purely academic effort. In essence, the students were hired as integral quality improvement experts with an expectation of performance.

3.2 DMAIC Process

While there were numerous lessons learned in the building aspect of the study, there were some critical lessons learned from the application of the Six Sigma DMAIC process.

The key findings of the define phase took two forms. The first was driven by the perception that the project was already defined. This was true in a general context, but not based on the Six Sigma methodology. It was only through strong leadership that the team was able to come together and understand what was really needed in the define phase. The second finding was the level of detail that was needed to perform Six Sigma. The define phase foreshadowed the depth and breadth of work that was to come.

The measure phase was of great interest to the team. It was the first opportunity that many of the UCF team members had to understand the DCF Tier 3 Review Process and to view and collect data. There were many insights gleaned from this phase, but three were critical. The first observation was that the team needed to become intimate with the process as soon as possible. In order to meet this task, a portion of the team interviewed a focus group of DCF employees and conducted a 16-question survey. The second insight was how critical it was to understand the data. In this study, the data supplied by DCF consisted of 37 categories. The amount of data supplied was initially overwhelming. However, as the team better understood the process, its understanding of the collected data also increased. This insight will be further addressed in the analyze phase. The final insight from this phase was the calculation of the process sigma. The process sigma is normally measured in terms of defects per million opportunities (DPMO); however, the team learned early in the data collection process that there was a difference in the way quality was measured by the state and federal inspectors versus the local operating agency. The local agency had a subset of 13 critical categories, whereas the state only looked to see if the reports reviewed were correct or incorrect. The reduction in the fidelity of the measured data drove the team to evaluate the process sigma using attribute data. The use of attribute data is what necessitated the measurement based on defects per million (DPM) instead of DPMO. The use of attribute data required the team to use several different procedures during the analyze phase, specifically in the creation of the process control diagrams.

The analyze phase allowed the team to learn two main lessons. The first was in the area of
Pareto analyses. After creating an initial set of Pareto diagrams, the team was unable to see a traditional 80-20 break point. Based on this observation, the team decided to prioritize the data categories into three categories. The three categories were weighted and the Pareto charts were replotted. The result is that actual breaks in the data could be determined. This enabled the team to evaluate the data in a systematic manner and provide useful information to determine improvements. The second lesson learned in the analyze phase was that team observations are critical to understanding a process before forming options to improve a process. It was noted by several team members that they could better understand the nature of the process by seeing it in operation. While this finding is quite logical, it was not initially a priority for the team, but once accomplished it made the linkages between the data and potential solutions more salient.

The improve phase had one major finding. The improve phase showed the team that the previous phases of the DMAIC process generated information that was very useful for proposing improvement ideas. The key result of this work was a matrix of 16 improvement ideas (reduced from an initial list of 60) that were prioritized based on the potential benefits (high or low) and the resources needed to accomplish the improvement (high or low). From the list of improvements, there were three that could be immediately implemented into the Tier 3 Review Process. The remaining improvements needed additional approval and resources from DCF. In addition to the list of improvements, the team was able to combine the information gained throughout the process and develop a comprehensive set of performance measures that would be useful to the organization. The key lesson is that following the methodology will generate information that is useful for improvement.

After the improve phase, the combined team was concluded and the recommendations for improvement were delivered by the DCF team members to the leadership of DCF. The control measures of this process will depend greatly on what improvements are implemented. Some of the improvement measures will result in modification of training programs, while others will increase the visibility of the performance measures to the organization.

4. Results

The results of this study should be looked at from several vantage points.

The first vantage point is from the perspective of the DCF team members. The team members were able to closely pair with the students from UCF. This arrangement may have initially appeared to be convoluted; however, it truly turned out to be an exceptional relationship. DCF was able to serve as experts on their processes, while harnessing the knowledge, energy, and enthusiasm of doctoral and master's level students supported closely by local area quality experts and mentors.

The second vantage point is that of the UCF students and faculty. The UCF students were able to have a truly experiential learning experience while implementing the Lean and Six Sigma framework. The UCF team members learned that they could quickly assimilate the knowledge of an organization that was initially very different from any previous experience. The UCF students also learned quickly that the process would not move forward unless they put in the work to move the process forward and generate the required products. For their efforts, the UCF students were certified as Six Sigma Green Belts and Black Belts depending on their experience level.

The third vantage point is that of the results produced by the DMAIC process. It can be clearly shown that a systematic process led to substantial improvement suggestions. The processes have been successfully used in the past, so the results of the suggestions, once implemented, should be no different.

As of the writing of this paper, DCF Circuit 18 has implemented four of the sixteen recommendations provided by the team. The results of the four suggestions are at various stages of execution.

The recommendations that have been implemented have increased the number of cases under review, added and improved monthly and quarterly training, and finally established written guidance in the form of a checklist for reviewers to follow. The short-term effects of these
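The process-sigma arithmetic discussed in the measure phase can be sketched as follows; the report counts are invented for illustration (not the study's data), and the sigma conversion applies the conventional 1.5-sigma shift:

```python
from statistics import NormalDist

def dpm(defects, units):
    """Defects per million, using attribute (correct / incorrect) data."""
    return defects / units * 1_000_000

def dpmo(defects, units, opportunities):
    """Defects per million opportunities, the usual Six Sigma metric."""
    return defects / (units * opportunities) * 1_000_000

def sigma_level(defects_per_million):
    """Short-term sigma level, applying the conventional 1.5-sigma shift."""
    yield_fraction = 1 - defects_per_million / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# Assumed illustrative numbers, not from the study:
reports_reviewed = 500
defective_reports = 31   # attribute data: a report is correct or incorrect

rate = dpm(defective_reports, reports_reviewed)   # 62000.0 DPM
print(rate, round(sigma_level(rate), 2))
```

With 13 critical categories per report, the same counts would instead give a DPMO of 31 / (500 * 13) * 1,000,000; losing the per-category detail is exactly the fidelity reduction that forced the team onto DPM.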
Design, Implementation, and Maintenance (see Figure 1).

[…] and new system desires were determined via traditional methods, using interviews with clients Dean McKinley-Floyd and Mr. Dupree, director of SBI media and news, during the Analysis phase. The design phase incorporated information from the interviews and converted it into logical and then physical systems specifications.
3. Communication Strategy
includes updating the equipment and processes at the School of Business and Industry, which includes purchasing the equipment necessary for the improvement of information streamed throughout the School of Business and Industry. The additional cost of hiring personnel with the expertise to complete the project must also be considered.

Both tangible and intangible costs and benefits have been identified with this project. Tangible costs for the new system consist of conversion equipment, a sound board, personal computers for the VIC staff to use, an entirely new background set for the TV taping room, and, most importantly, additional servers that will be solely for the installation of this new information system. Labor needs associated with the staffing for the technical and production side will consist of approximately 20-25 people, per Mr. Dupree's request. In the event that the staff selected has experience and understanding of both functions, then labor needs will be reduced to about 15-20 students (see Table 1). Intangible costs are a factor as well. There is a high cost associated with the improvement and actualization of this system, not including the cost of information not reaching students as desired. Another unexpected cost includes equipment repair and/or updates. One-time costs that are incurred at the beginning of the project include the new hardware and software utilized to house the system and system training for the students (as needed).

Costs and benefits:

Cost: Initial transfer from VHS to DVD is timely, and the cost of the hardware will need to be incurred.
Benefit: Digitized audio/video format can be easily cataloged and recovered for future reference.

Cost: Additional people will be required to maintain the system; therefore the cost of resources will increase.
Benefit: Archived TV tapings, Forums, and Close-ups can be utilized as a training tool for students and professors alike.

Cost: The DVD creation will require purchase and inventory of recordable DVDs.
Benefit: Companies can receive a copy of the events in which they participated in DVD format, which is more user friendly.

Cost: Setup and maintenance costs for the broadcasting of the digitized events; may need cooperation from the school of journalism.
Benefit: The data in digital format can be broadcast to inform others of the current events within and around the school of business.

Cost: In order to use the videos and photos to market the program, marketing costs would be created (flyers, DVDs, magazines, and articles).
Benefit: The video and photos can be used as a marketing tool to recruit the best and brightest students and corporate professionals.
First Objective
• Short-term goals for the first objective include creating and streamlining an efficient set of procedures to gather news and information through a variety of means, including SBI administration, staff, students and faculty, the campus community, as well as the local community. This information will be displayed via WSBI, plasma screens, etc. Individuals requesting information to be displayed must complete an Information Request Form from Mr. Dupree or receive the form via email (wsbi@gmail.com or wbsi@famu.edu) upon request. All fields on the Information Request Form should be completely filled out and sent/delivered to Professor Dupree and his aides. Once the information has been received and reviewed, it must be organized and formatted such that it can be stored and then sent out using various communication methods.
[…] all recorded events will be maintained and placed on the server for users to access. The database will include information such as the dates of the recorded events, the name of the company/organization, the presenter, etc.

[…] is an excellent mechanism to send and receive correspondence, receive digital clips from Alumni, and ultimately receive information such as scheduling visits, etc. in a more efficient manner.
Evans and Lindsay (2005) indicate this holistic viewpoint was based on three fundamental principles:

1. Focus on customers and stakeholders
2. Participation and teamwork by everyone in the organization
3. A process focus supported by continuous improvement and learning

Through Total Quality Management (TQM), workers were empowered to provide input throughout the entire process to ensure and instill confidence that products met customer specifications. In meeting the customer requirements, the term quality assurance became widely used as any planned and systematic activity directed toward providing consumers with products of appropriate quality, along with the confidence that products meet consumers' requirements (Evans and Lindsay, 2004). This extension of the Continuous Improvement viewpoint coined the terms Big Q and Little Q, which referred to the quality of management and the management of quality, respectively (Evans and Lindsay, 2004). Driven by upper management, TQM focused not just on identifying and eliminating problems that caused defects but also on educating the workforce to improve overall quality. Upper management realized it needed to focus more on identifying and eliminating problems that were causing the defects throughout the process rather than just waiting until the end for post-production inspections. In order to do this, continuous education programs were implemented at all levels. As time passed, these total quality management efforts soon began to fall short of the expectations of businesses in the United States. Companies vowed they could promise a certain level of quality by getting it right the first time, but in the end the processes were still not meeting all the needs and requirements of the consumer. TQM was joined by many other acronyms, like JIT, MRP and TPM, that all guaranteed an unreachable quality standard (Basu 2001). Because these methods focused so greatly on the parts of the whole rather than the whole, they fell short in implementing rapid changes. This led to the introduction of certain quality control programs, one of the most recent being Six Sigma.

Six Sigma is a discipline named for the Greek letter sigma, the conventional symbol for standard deviation. Six Sigma is a very structured process that focuses on quantitative data to drive decisions and reduce defects (Benedetto 2003). Many companies have tried to implement this program without truly understanding the theory and concepts behind it. Because of such misguided efforts, Six Sigma has received a somewhat jaded reputation, but it is not the discipline that is flawed but rather the application of it. As Evans and Lindsay note, a cookbook approach, in which management reads the latest self-help book advocated by business consultants and blindly follows the author's recommendations, will destine Six Sigma or any other management practice to failure, because experience only describes what happened and is no help to the emulating management team (Evans and Lindsay, 2004). Understanding the subtle theory, on the other hand, will enable one to better understand the cause-and-effect relationships that are applied in Six Sigma and other management practices. In this paper, we will further develop the historical roots of the quality revolution already illustrated, show how the quality revolution developed into Six Sigma, delve further into the underlying theory of Six Sigma, and then analyze the uses of some Six Sigma tools in an effective, coherent Six Sigma program.

LITERATURE REVIEW

Multiple scientists have contributed to the evolution of quality management. The main three who added to the development of Six Sigma, and who will be discussed here, are William Edwards Deming, Joseph Juran and Philip Crosby. Deming had a Ph.D. in physics and was trained as a statistician at Yale University. He and Juran both worked for Western Electric during the 1920s and 1930s (Evans and Lindsay, 2004). From there he started working with the U.S. Census Bureau, where he perfected his skills in statistical quality control methods. He opened his own practice in 1941 and thus began teaching SQC methods to engineers and inspectors in multiple industries. Deming, better
161
Proceedings of the 2008 IEMS Conference
known as the Father of Quality, was widely goals by identifying the customers and their
ignored in the United States for his works. As a needs. Quality control was the process of
result, he went to Japan right after World War II meeting quality goals during operations with
to begin teaching them SQC. His work there is minimal inspection. Quality improvement was
what truly bolstered the “Made in Japan” label the process of breaking through to
to its well-respected level of quality today (Dr. unprecedented levels of performance to produce
W. Edwards Deming, 2007). Dr. Deming’s most the product (Evans and Lindsay, 2004). His
famous work is the 14 points in his System of philosophy required a commitment from top
Profound Knowledge. There are four interrelated management to implement the quality control
parts in the program: Appreciation for a System, techniques and emphasized training initiatives
Understanding of Variation, Theory of for all.
Knowledge and Psychology (Evans and
Lindsay, 2004). In the first point, Deming The third philosopher who contributed greatly to
believed it was poor management to purchase six-sigma was Philip Crosby. Crosby worked as
materials at the lowest price or minimize the the Corporate VP for Quality at International
cost of manufacturing if it were at the expense of Telephone and Telegraph for 14 years. His most
the product. The second and third points well-known work is the book Quality is Free
emphasized the importance of management where he emphasized management should “do it
understanding the project first and foremost right the first time” and popularized the idea of
before taking any steps to reduce variation. The the “cost of poor quality.” Crosby emphasized
methods to reduce variation include technology, that management must first set requirements and
process design and training. Under the fourth those are on which the degree of quality will be
point, Deming stressed understanding the judged. He also believed doing things right the
dynamics of interpersonal relationships because first time to prevent defects would always be
everyone learns in different ways and speeds and cheaper than fixing problems which developed
thus the system should be managed accordingly. into the cost of poor quality idea. Similar to six-
Other points in the Deming school of thought sigma, Crosby focused on zero defects and
were on-the-job training, creating trust, building created four Absolutes of Quality Management
quality into a product or service from the to ensure that goal was accomplished (Skymark
beginning and inclusion of everyone in the Corporation, 2007):
company to accomplish project improvement.
The common methodology used by Deming for 1. Quality is defined as conformance to
improving processes is PDCA which stands for requirements, not as “goodness” or
Plan – Do – Check – Act. “elegance”
2. The system for causing quality is
Joseph Juran is well known for his book, the prevention, not appraisal
Quality Control Handbook published in 1951. 3. The measurement standard must be zero
He too went to Japan in the 1950s after working defects, not that’s close enough
at Western Electric to teach the principles of 4. The measurement of quality is the Price
quality management as they worked to improve of Nonconformance, not indices.
their economy. His school of thought was
different from Deming’s in that he believed to These philosophies have been categorized as
improve quality companies should use systems more evolutionary because the changes resulting
with which managers are already familiar. When from their implementation were more gradual.
consulting firms, he would design programs that As time continued, managers became
fit the company’s existing strategic efforts to increasingly disappointed with TQM and
ensure minimal rejection by staff. The main searched for philosophies that would create
points he taught were known as the Quality more drastic changes in the existing processes
Trilogy: Quality Planning, Quality Control and and systems. These revolutionary processes
Quality Improvement. Quality planning was the recognized that processes are key to quality, that
process of preparing companies to meet quality most processes are poorly designed and
162
Proceedings of the 2008 IEMS Conference
implemented, that overall success is sensitive to revised to a tenfold improvement every two
individual sub-processes success rates and that years, 100-fold every four years and 3.4 defects
rapid, dramatic change requires looking at the per million in five years (Gitlow & Levine,
entire process. One of the first of these 2005). A defect in six-sigma terms is any factor
revolutionary philosophies was Reengineering. that interferes with profitability, cash flow or
Reengineering is defined as the fundamental meeting the customers’ needs and expectations.
rethinking and radical redesign of business This led to the next important concept of six-
processes to achieve dramatic improvements in sigma: critical to quality. Critical-to-Quality
critical, contemporary measures of performance (CTQ) items are points of importance to the
(Benedetto, 2003). It places heavy reliance on customer be it internally or externally (Adomitis
statistics and quantitative analysis. and Samuels, 2003). Six-sigma projects aim to
improve CTQs because research has proven that
Six-sigma emerged during this revolutionary a high number of CTQs correlates with lost
time period after reengineering. The two customers and reduced profitability. Ultimately,
approaches are very similar but also have many the company will have reduced the number of
differences. Whereas reengineering focuses on defects and focused on the CTQs such that it
the statistical aspect, Six-sigma places more will cost more to correct the defects any further
emphasis on data to drive decisions. When than to prevent their occurrence in the first
comparing six-sigma to Total Quality place.
Management it is very apparent that great
changes have come about in the evolution of This thought was advanced in the 1980s by Dr.
quality. Six-sigma has a more structured and Genichi Taguchi with his Quality Loss function.
rigorous training development program for those Prior to that time, it was thought that the
professionals using it. Six-sigma is a program customer would accept anything within the
owned by the business leaders, also known as tolerance range and that product costs do not
champions, while TQM programs are based on depend on the actual amount of variation as long
worker empowerment. Six-sigma is cross as it was still within the tolerance range (Evans
functional and requires a verifiable return on and Lindsay, 2004). Taguchi introduced a model
investment in contrast to TQM’s function or that discredited that school of thought by
process based methodology that has little showing the approximate financial loss for any
financial accountability (Evans and Lindsay, deviation from a specified target value – an idea
2004). that coincides with six-sigma’s inclusion of
financial data in the assessment of a process or
Six-sigma started as a problem solving approach function. His work contradicted the original
to reduce variation in a product and concept and showed that variation is actually a
manufacturing environment. It has since grown measure of poor quality such that the smaller the
to be used in a multitude of other industries and variation is about the nominal level, the better
business areas such as service, healthcare and the quality is. As the graph below depicts, this
research and development. It represents a loss function also applied an economic value to
structured thought process that begins with first variation and defects from the standard. As the
thoroughly understanding the requirements process or project deviates from its target value,
before proceeding or taking any action. Those costs become increasingly higher and the
requirements define the deliverables to be customer becomes increasingly dissatisfied.
produced and the tasks to produce those Quality loss can be minimized by reducing the
deliverables which in turn illustrate the tools to number of deviations in performance or by
be used to complete the tasks and produce the increasing the product’s reliability. In six-sigma,
deliverables (Hambleton, 2007). Six-sigma was the goal is to minimize the number of defects
initiated in the 1980s by Motorola. The company and keep the process as close to the nominal
was looking to focus its quality efforts on value as possible to avoid high degrees of
reducing manufacturing defects by tenfold variability. The concepts of Taguchi are part of
within a five-year period. This goal was then
163
Proceedings of the 2008 IEMS Conference
Six-sigma’s key philosophy of reducing defects Sigma Level Defects per Million
and variation from the norm. Opportunities
One Sigma 690,000
As previously mentioned six-sigma is a product Two Sigma 308,000
of the statistical methods era of controlling Three Sigma 66,800
quality. It stressed the common measure of Four Sigma 6,210
quality, the defect. In measuring output quality, Five Sigma 230
the key metrics used are defects per million Six-sigma 3.4
opportunities (DPMO), Cost of Poor Quality Table 1: Defects per Million Opportunities at
(COPQ) and sigma level (Druley and Rudisill, Various Sigma Levels
2004). Defects per million opportunities takes
into consideration the number of defects One phenomenon that has been widely discussed
produced out of the entire batch and the number in relation to six-sigma is the 1.5 sigma shift.
of opportunities for error. It is important to Because six-sigma focuses on variation, a chart
assess quality performance using the defects per to understand deviation from the mean should be
million opportunities because two different used to further understand this concept. A Z-
processes may have significantly different table shows the standard deviation from the
numbers of opportunities for error, thus making mean. A process’ normal variation, defined as
comparison unbalanced. Furthermore, large process width, is +/- 3-sigma about the mean. In
samples provide for a higher probability of looking at Table 2, it is clear that a quality level
detection of changes in process characteristics of 3-sigma actually corresponds with 2,700
than a smaller sample. When Motorola created defects per million opportunities. This number
this ultimate goal of six-sigma, it was set to be may sound small but it is similar to 22,000
equivalent to a defect level of 3.4 defects per checks deducted from the wrong bank account
million opportunities. According to Evans and each hour or 500 incorrect surgical operations
Lindsay, this figure was chosen because field each week. This level of operating efficiency
failure data suggested that Motorola’s processes was deemed unacceptable by Motorola hence
drifted by this amount on average (Evans and the search for a design that would ensure greater
Lindsay, 2004). The cost of poor quality metric quality levels. A six-sigma design would have
was introduced by Crosby. It includes the costs no more than 3.4 DPMO even if shifted +/-1.5
of lost opportunities, lost resources and all the sigma from the mean. In looking once again at
rework, labor and materials expended up to the Table 2, it is evident a quality level of six-sigma
point of rejection. This metric is used to justify actually corresponds with two defects per billion
the beginning of a six-sigma project. When a opportunities while the more commonly known
company is choosing which project to undertake, value of 3.4 defects per million opportunities is
the project with the highest COPQ will most found at the six-sigma level with a +/-1.5-sigma
likely be selected. Lastly, the sigma level shift off the center value or +/-4.5-sigma. This
indicates the degree of quality for the project. shift is a result of what is called the Long Term
Sigma is a statistical term that measures how Dynamic Mean Variation (Swinney, 2007). In
much a process varies from the nominal or simpler terms, no process is maintained perfectly
perfect target value based on the number of at center at all times, thus allowing drifting over
DPMO. As mentioned, Motorola’s goal was to time. The allowance of a shift in the distribution
have 3.4 defects per million opportunities which is important because it takes into consideration
would happen at the six-sigma level. The both short and long-term factors that could affect
DPMOs for various sigma levels are compared a project such as unexpected errors or
in Table 1 below (iSixSigma, 2002). movement. Also worth noting is that the goal of
3.4 DPMO can be attained at 5-sigma with a +/-
0.5-sigma shift or at 5.5-sigma with a +/-1-
sigma shift. However as Taguchi pointed out, it
is important to minimize the noise factors that
could shift the process dramatically from the
164
Proceedings of the 2008 IEMS Conference
nominal value. Table 2 below shows the DPMO to improving other functions of the business
quality levels achieved for various combinations such as marketing, finance and advertising. The
of off-centering and multiples of sigma (Evans most common methodology is DMAIC. One
and Lindsay, 2004). goal of using the DMAIC methodology is to
identify the root cause of the problem and select
METHODOLOGY the optimal level of the CTQs to best drive the
Within six-sigma there are different project desired output. Another goal is to improve
based methods that can be followed. Among PFQT: Productivity, Financial, Quality and
them are Lean Sigma, Design for Six-sigma, Time spent (Hambleton, 2007). It is used as an
Six-sigma for Marketing and the most common iterative method to combat variation. With the
DMAIC – Define, Measure, Analyze, Improve built in ability to contrast variation, DMAIC
and Control (Hambleton, 2007). Lean Sigma is a intrinsically allows for flexibility because as
more refined version of the standard Six-sigma knowledge is learned thorough implementation,
program that streamlines processes to its assumptions of the root cause may be disproved,
essential value-adding activities. In doing this, requiring the team to modify or revisit
the company aims to do things right the first alternative possibilities. Kaoru Ishikawa
time with minimum to no waste. Wait time, promoted seven basic tools that can be used in
transportation, excess inventory and assessing quality control. The list has since been
overproduction are among the wasteful activities expanded upon to include many other tools but
that can be eliminated. One test used to identify the original seven were Cause-and Effect
the value-add activities that will reduce waste is diagrams, Check sheets, Control charts,
the 3C Litmus Test: Change, Customer, Correct Histograms, Pareto charts, Scatter diagrams and
(Hambleton, 2007). If the activity changes the Stratification (Hambleton, 2007). Some of these
product or service, if the customer cares about tools will be further explained below.
the outcome of the activity and/or if the activity
is executed and produced correctly the first time In the first step Define, the company clearly
then it is a value-add activity and should be kept. defines the problem. It begins the trust and
Lean Sigma was popularized in Japan after dedication from all stakeholders and all persons
World War II by Toyota. The Toyota Production included in the project team. The project team
System challenged Ford’s standard for a large consists of the Champion, Master Black Belt,
lot production system by using systematically Black Belt, Green Belt and Team Members. Key
and continuously reducing waste through small activities occurring in this phase include
lot productions only. Design for Six-sigma selecting the team members and their roles,
(DFSS) expands Six-sigma by taking a developing the problem statement, goals and
preventative approach by designing quality into benefits and developing the milestones and high
the product or process. DFSS focuses on growth level process map (iSixSigma, 2007). One tool
through product development, obtaining new used in this step is the process flowchart. Four
customers and expanding sales into the current main types of flow charts are considered: top-
customer base (Hambleton, 2007). It focuses on down, detailed, work flow diagram and
not just customer satisfaction but customer deployment (Kelly and Sutterfield, 2006). This
success because from the inception of the tool shows how various steps in a process work
product or service the company will focus on together to achieve the ultimate goal. Because it
optimizing CTQs. In order to do this, companies is a pictorial view, a flow chart can be applied to
use multivariable optimization models, design of fit practically any need. The process map allows
experiments, probabalistic simulation the user to gain an understanding of the process
techniques, failure mode and effects analysis and and where potential waste or bottlenecks could
other statistical analysis techniques. This method occur. It also could be used to design the future
is usually begun only after the Six-sigma or desired process. An example process map for
program has been implemented effectively. The a call answering operation is shown below in
Six-sigma for Marketing methodology simply Figure 2.
means applying six-sigma ideas and principles
165
Proceedings of the 2008 IEMS Conference
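The DPMO and sigma-level metrics described above are easy to compute directly. The following Python sketch is illustrative only (it is not from the paper); it reproduces the Table 1 values under the one-sided, 1.5-sigma-shift convention discussed in connection with the Long Term Dynamic Mean Variation:

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities: int) -> float:
    """Defects per million opportunities for an observed process."""
    return defects / (units * opportunities) * 1_000_000

def dpmo_at(sigma: float, shift: float = 1.5) -> float:
    """DPMO at a given sigma level (one-sided tail, 1.5-sigma shift)."""
    return (1 - NormalDist().cdf(sigma - shift)) * 1_000_000

def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
    """Invert dpmo_at: the sigma level that yields a given DPMO."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

print(round(dpmo_at(3.0)))         # 66807, close to Table 1's 66,800
print(round(sigma_level(3.4), 2))  # 6.0
```

Under this convention, 3.4 DPMO inverts to a one-sided quantile of 4.5 sigma, which the 1.5-sigma allowance brings back to the six-sigma level.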
In the second step, Measure, the company quantifies the problem by understanding the current performance and collecting the data necessary to improve all CTQs. Key activities occurring in this phase include defining the defect, opportunity, unit and cost metrics, collecting the data, and determining the process capability. One tool used in this step is the SIPOC Diagram. SIPOC stands for Suppliers, Inputs, Processes, Outputs and Customers. This tool is applied to identify all related aspects of the project – who the true customers are, what their requirements are, who supplies inputs to the process – at a high level before the project even begins (iSixSigma, 2007). These diagrams are very similar to process maps, which are also a tool applied at this phase of the DMAIC cycle. A SIPOC diagram is shown below in Figure 3 (Simon, 2007).

In the third step, Analyze, the root cause of the project’s problem is investigated to find out why the defects and variation are occurring. Through this detailed research, the project team can begin to find the areas for improvement. Key activities in this phase are identifying value and non-value-added activities and determining the vital few, or critical-to-quality, elements. The care applied to this phase of the Six-sigma project is very important to the project’s success because a lack of understanding and thorough analysis is what causes most defects and variation. Evans and Lindsay (2005) described that these could also be caused by:

- Lack of control over the materials and equipment used
- Lack of training
- Poor instrument calibration
- Inadequate environmental characteristics
- Hasty design of parts and assemblies.

Multiple tools exist to guide the efforts of this process. They include, but are not limited to, scatter diagrams, Pareto charts, histograms, regression analyses and fishbone diagrams. A scatter diagram shows the relationship between two variables. The correlation is indicated by the slope of the diagram, which could be linear or non-linear. If linear, the correlation could be direct or indirect and positive, negative or null. If non-linear, a regression analysis can be used afterwards to measure the strength of the relationship between the variables. The various types of scatter plots are shown in Figure 4.

The Pareto chart is used to separate the “vital few” from the “trivial many” when analyzing a project. It communicates the 80/20 rule, which states that 80 percent of an activity comes from 20 percent of the causes, and which thus provides the rationale for focusing on those vital few activities (Hambleton, 2007). The problem frequencies are plotted in order from greatest to least and show the problems having the most cumulative effect on the system (Kelly and Sutterfield, 2006). As mentioned earlier, six-sigma focuses on eliminating as many noise factors as possible, and Pareto analysis helps to do just that. In this way, the company does not expend resources on wasteful activities that do not add value, but instead add cost, to the end project. Figure 5 shows a typical cumulative Pareto chart.

In the fourth step, Improve, the research into the problem’s root cause is actually put to work by eliminating the defects and reducing the degree of variation. Key activities in this phase are performing design of experiments, developing potential solutions, assessing failure modes of potential solutions, validating hypotheses, and correcting and re-evaluating potential solutions. Failure Mode and Effects Analysis (FMEA) is a tool used in this step of the DMAIC process to identify a failure, its mode and its effect through analysis. The analysis prioritizes the failures based on severity, occurrence and detection. Through this analysis a company can create an action plan should a failure occur. Results of an FMEA include a list of effects, causes, potential failure modes, potential critical characteristics, documentation of current controls, requirements for new controls, and documentation of the history of improvements. Items listed low on the priority list do not necessitate an action plan to correct them unless they have a high probability of occurrence or are special-cause circumstances (Hambleton, 2007).
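The severity, occurrence and detection ratings of an FMEA are commonly combined into a single risk priority number (RPN) by multiplication. The paper does not spell out this arithmetic, so the sketch below should be read as one conventional scheme, with hypothetical failure modes:

```python
# Hypothetical failure modes for a call answering process;
# ratings use the conventional 1-10 FMEA scales.
failure_modes = [
    {"mode": "dropped call",   "severity": 7, "occurrence": 4, "detection": 3},
    {"mode": "wrong transfer", "severity": 5, "occurrence": 6, "detection": 2},
    {"mode": "long hold time", "severity": 4, "occurrence": 8, "detection": 5},
]

def rpn(fm: dict) -> int:
    """Risk priority number: severity x occurrence x detection."""
    return fm["severity"] * fm["occurrence"] * fm["detection"]

# Highest-RPN failure modes are addressed first in the Improve step.
for fm in sorted(failure_modes, key=rpn, reverse=True):
    print(fm["mode"], rpn(fm))
```

Sorting by RPN gives the action-plan priority the text describes: items with low RPN need no corrective plan unless occurrence is high or the failure is a special-cause circumstance.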
In the fifth and final step, Control, the project improvements are monitored to ensure sustainability. Key activities include developing standards and procedures, implementing statistical process control, determining process capability, verifying benefits, costs and profit growth, and taking corrective action when necessary to bring the process back to its nominal value (iSixSigma, 2007). The most important tool used in this phase is the control chart, a statistical method based on continuous monitoring of process variation. The chart is drawn using upper and lower control limits along with a center, or average value, line. As long as the points plot within the control limits, the process is assumed to be in control. The upper and lower control limits are usually placed three sigma away from the center line because, for a normal distribution, data points fall within 3-sigma limits 99.7 percent of the time. These control limits are different from specification limits: control limits help identify special-cause variation and confirm stability, while specification limits describe conformance to customer expectations (Skymark Corporation, 2007). However, if there are many outliers, points gravitating toward one of the control limits, or an apparent trend in the points, the process may need to be adjusted to reduce the variation and/or restore the process to its center. The control chart can help assess the quantitative gains made through the improvements because each point will be compared to the target value. An example of a control chart for a process in control is shown in Figure 6 below.

APPLICATION OF METHODOLOGY

Multiple success stories have come from implementing the aforementioned tools in a six-sigma project. The best-known success story is that of General Electric under the direction of Jack Welch. Welch began a six-sigma project to work on rail car repairs and aircraft engine imports for Canadian customers. Through his guided efforts, the company reduced repair time, redesigned its leasing process, reduced border delays and defects, and improved overall customer satisfaction. Quantitatively, General Electric saved over $1 billion in its first year of six-sigma application and then $2 billion in the second year. The company’s operating margin rose to 16.7 percent, while revenues and earnings increased by 11 and 13 percent respectively (Black, Hug and Revere, 2004).

Six-sigma has grown in application from just the manufacturing environment to banking, healthcare and automotive as well. In 2001, Bank of America’s CEO Ken Lewis began focusing on increasing the customer base while improving company efficiency. The company handled nearly 200 customer interactions per second and thus set the ultimate goal of its six-sigma efforts to be customer delight. Bank of America established a customer satisfaction goal and created a measurement process to evaluate current performance in order to work toward improving the state. In the first year, the bank’s defects across electronic channels fell 88 percent, errors in all customer delivery channels and segments dropped 24 percent, problems taking more than one day to resolve went down 56 percent, and new checking accounts had a 174 percent year-over-year net gain. Within four years of beginning the project, the bank’s customer delight rose 25 percent. Through the application of six-sigma, Bank of America was able to focus on the voice of the customer in determining what mattered most to customers in order to make their experience a pleasant one (Bossert and Cox, 2004).

The Scottsdale Healthcare facility in Arizona began a six-sigma project to work on its overcrowded emergency department because it took 38 percent of a patient’s total time within the department to find a bed and transfer the patient out of the waiting room. Before the quality efforts were implemented, multiple intermediary steps existed in the process, which inevitably slowed down the time from start to finish and reduced the potential yield. As a result of the DMAIC and Lean Sigma efforts, the facility identified that the root cause of the problem was not finding a bed, as originally thought, but rather the number of steps involved in the transfer process. This solution produced incremental profits of $600,000 and reduced the cycle time for bed control by 10 percent. Moreover, patient throughput in the emergency room increased by 0.1 patients/hour (Lazarus and Stamps, 2002). This project proved one of six-sigma’s key arguments: that inspection is unproductive, and that quality control should instead be implemented from the beginning of a product or service to reduce non-value-add activities.

It is important to note that not all applications of six-sigma have led to success. A common explanation for the failures is that companies and managers read the latest self-help book and blindly followed the author’s recommendations. In doing this, they failed to fully grasp the theory behind the approach. Experience only describes programs like six-sigma, but understanding the theory will help to understand the cause-and-effect relationships, which can then be used for rational prediction and management decisions. It is also important to note that many of the failures are not due to a flaw in six-sigma’s conceptual basis, but rather to failures in the mechanics of team operation. According to Evans and Lindsay, 60 percent of six-sigma failures are due to the following factors:

- Lack of application of meeting skills
- Improper use of agendas
- Failure to determine meeting roles and responsibilities
- Lack of setting and keeping ground rules
- Lack of appropriate facilitative behaviors
- Ambivalence of senior management
- Lack of broad participation
- Dependence on consensus
- Too broad/narrow initiatives.

Theory explains the development of a discipline and the reason(s) that it is better than its predecessors. As explained, Six-sigma is used to resolve problems resulting in deviation between what should be happening and what is actually happening. The methodology and tools of Six-sigma can handle any problem type, from conformance to efficiency to product/service design. However, in order for the program to effectively mitigate all project risks and be implemented successfully, there must be commitment from the top down. Through this paper the history of quality has been presented. As its definition and application have evolved from Frederic Taylor’s scientific management theories to the present program of six-sigma, and even looking forward to more refined programs such as Fit Sigma, quality continues to be a concept of vital importance for all businesses and industries. In this paper, the underlying theory and concepts contributed by each quality philosopher and each revolutionary model were explained to show how they are similar but, more importantly, how they are different. For each company and each industry the same quality control methodology may not be appropriate, because each operates under different circumstances. The benefits that accrue from applying the different techniques of six-sigma to various production situations are invaluable to a company truly intent upon quality improvement, longevity in its market, and global competitiveness.
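Taguchi's quality loss function, described in the literature review above, is conventionally written L(y) = k(y - m)^2 for a measured value y and target m. The sketch below is illustrative; the constant k and the costs are invented for the example, not taken from the paper:

```python
def taguchi_loss(y: float, target: float, k: float) -> float:
    """Quadratic quality loss L(y) = k * (y - target)**2."""
    return k * (y - target) ** 2

# k is usually fixed from the repair cost at the tolerance limit:
# if a (hypothetical) deviation of 0.5 mm costs $40 to fix,
# then k = 40 / 0.5**2 = 160.
k = 40 / 0.5 ** 2
print(taguchi_loss(10.25, 10.0, k))   # 10.0 - loss grows with deviation
```

The quadratic form captures the point made in the text: loss is zero only at the target and grows continuously with any deviation, even inside the tolerance range.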
[Figure 2: Example process map for a call answering operation; one visible branch is labeled "Call is not returned"]
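A process map such as the one in Figure 2 can also be represented programmatically, for example as an adjacency list whose terminal steps are the possible outcomes of the process. The step names below are hypothetical, sketched for a call answering operation:

```python
# Hypothetical call-answering process map as an adjacency list;
# only the "call is not returned" branch label comes from Figure 2.
process_map = {
    "call received":  ["answered", "not answered"],
    "answered":       ["issue resolved"],
    "not answered":   ["voicemail left"],
    "voicemail left": ["call returned", "call is not returned"],
    "call returned":  ["issue resolved"],
}

def end_points(flow: dict) -> set:
    """Steps with no outgoing arrow are the possible process outcomes."""
    steps = set(flow) | {s for nexts in flow.values() for s in nexts}
    return steps - set(flow)

print(sorted(end_points(process_map)))
# ['call is not returned', 'issue resolved']
```

Enumerating terminal steps this way is one simple check that a drafted process map accounts for every outcome the customer can experience.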
[Figure 4: Scatter diagrams (Y versus X) illustrating positive correlation, negative correlation, and no correlation]
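The scatter-plot categories of Figure 4 can be checked numerically with Pearson's correlation coefficient. In the sketch below, the +/-0.5 cutoff is an arbitrary illustrative threshold, not a rule from the paper:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed directly."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def classify(xs, ys, threshold=0.5):
    """Label a scatter relationship the way Figure 4 does."""
    r = pearson_r(xs, ys)
    if r > threshold:
        return "positive correlation"
    if r < -threshold:
        return "negative correlation"
    return "no correlation"

print(classify([1, 2, 3, 4], [2, 4, 5, 9]))   # positive correlation
print(classify([1, 2, 3, 4], [9, 6, 4, 1]))   # negative correlation
```

As the text notes, this only measures linear association; for a non-linear pattern a regression analysis is the appropriate follow-up.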
[Figure 5: Typical cumulative Pareto chart, plotting number of defects with cumulative percentage]

[Figure 6: Control Chart, plotting measurements by sample number with the upper control limit (UCL), process mean, and lower control limit (LCL)]
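The control-chart logic of Figure 6 reduces to a center line and three-sigma limits. The sketch below estimates the limits from a hypothetical baseline sample; production SPC charts more often estimate sigma from subgroup ranges:

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Center line and +/- three-sigma control limits, as in Figure 6."""
    center = mean(baseline)
    sigma = stdev(baseline)  # sample standard deviation of the baseline
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical baseline measurements from a stable process.
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
lcl, center, ucl = control_limits(baseline)

# New points outside the limits signal special-cause variation.
new_points = [10.0, 10.1, 11.2]
flagged = [x for x in new_points if x < lcl or x > ucl]
print(flagged)   # [11.2] - the process has drifted
```

The drifted third point falls above the upper control limit and would be flagged for corrective action, while points inside the limits are treated as common-cause variation.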
with the nozzle and screen close together, and this point was defined as the jet “fully continuous.” At some point in each experiment, as the screen-to-nozzle distance was increased, the water jet became too diffuse to support electrical current. This point was defined as “fully discontinuous.” The point at which the stream was 50 percent continuous was then taken as midway between these two readings. While some of the assumptions in this experiment may be open to question, suffice it to say that the basic experimental approach is sound and conforms with best experimental practices. The approach attempts to perform all experimental runs under the same conditions so as to eliminate, or at least minimize, any random effects and interactions from creeping into any single run (Taguchi, 1988).

In this paper, we shall analyze the data from Theobald’s original experiment, using Taguchi methods, to determine whether or not we can conclude that one type of shape is better than another in promoting stream stability. This would be important to do if we were designing nozzles for commercial use, because it would be essential to produce and sell the type of nozzle with the greatest water jet stability, or coherence. It would also be important for the contour of such a nozzle to be determined at the least possible expense. Determining the best of the three types of nozzle would allow more experimentation with that best type to determine the optimum internal surface shape for maximizing water jet stability at least cost.

LITERATURE REVIEW

Since the subject of fire hose nozzle design is not a popular area for research, the literature in this area is necessarily sparse. Nonetheless, there is some small amount of pioneering literature in this area. A very important area of investigation in nozzle design would be that of the influence of Reynolds number on water jet stability. For the reader who may not be acquainted with fluid mechanics, the Reynolds number is a dimensionless quantity characterizing a flow: the higher the Reynolds number, the more chaotic the flow. When fluid flow is smooth, it is said to be laminar; when it is chaotic, it is said to be turbulent. Since turbulent flow is antagonistic to a stable water jet, it is obviously important to understand the relationship between Reynolds number and water jet stability. McCarthy and Molloy (1974) investigated this relationship for flow beginning in the laminar region (low Reynolds numbers), through the transition region, to fully turbulent flow (high Reynolds numbers). In this work, however, the same basic type of nozzle was used for each experiment, and only the diameter of the discharge opening was varied. This work resulted in plots of jet break-up length versus jet efflux velocity, or discharge velocity. Rouse et al. (1952), on the other hand, experimented extensively with concave nozzles of types 9 and 10 above in Figure 1, and actually proposed a design for concave nozzles. Rouse’s design proposal was the basis of the concave nozzles tested by Theobald. The work of Rouse was extended by Whithead et al. (1951) to high-speed fluid flow in wind tunnel applications. This work resulted in a prototype model, the internal contour of which was modified for high-speed wind tunnel applications.

An important piece of work in this area of investigation was that of Vereshchagin et al. (1958), which examined the disintegration of water jets as a function of jet length for various nozzle diameters. In this experimental work it was found that the distance from the nozzle at which a jet of water becomes discontinuous, viz., becomes incoherent, is relatively invariant out to some distance from the nozzle, after which the discontinuity distance has a Gaussian distribution. Theobald’s work re-examined this phenomenon using regression analysis and concluded that the jet discharge maintained approximately 100% coherence out to a distance of approximately 1.75 meters. After this point, jet discontinuity became random out to a distance of approximately 3.5 meters, at which point the jet became discontinuous. It was further found that the random distance between
number is a measure of the turbulence of 1.75 and 3.5 meters could then be described with
flowing fluids. In general, the lower the a Gaussian or Normal distribution.
Reynold’s number the smoother the flow, and
the greater the Reynold’s number the more
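To make the Reynolds-number discussion above concrete, Re = ρvD/μ can be evaluated for a representative water jet. This is a minimal sketch; the density, viscosity, nozzle diameter, and jet velocity below are illustrative assumptions, not values from the experiments discussed here.

```python
# Reynolds number Re = rho * v * D / mu for a water jet.
# All numerical values are illustrative assumptions, not data from
# the experiments discussed in the text.
rho = 1000.0   # density of water, kg/m^3
mu = 1.0e-3    # dynamic viscosity of water, Pa*s
D = 0.025      # nozzle exit diameter, m (assumed)
v = 30.0       # jet efflux velocity, m/s (assumed)

Re = rho * v * D / mu
print(f"Re = {Re:.2e}")
```

At these assumed conditions Re is on the order of 7.5 × 10⁵, far above the commonly quoted pipe-flow transition range of roughly 2,000-4,000, so a jet of this kind operates well into the turbulent regime.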
Proceedings of the 2008 IEMS Conference
Sided     0.07   0.04   0.05  -0.02  -0.02  -0.04  -0.01   0.00   0.07
Convex    0.24   0.11   0.11   0.07   0.08   0.08   0.08   0.13   0.90
Concave   0.19   0.13   0.09   0.08   0.08   0.08   0.07   0.07   0.78
Total     0.50   0.27   0.25   0.13   0.14   0.12   0.14   0.20   1.75

Table 4 – Averaged shape factor data for the three nozzle designs

For this data we obtain a correction factor of …

CF = [Σ(xij)Str + Σ(xij)Cnvx + Σ(xij)Cncv]² / 24

The variation due to jet exit velocity is then …

SV = [(0.07 + 0.24 + 0.19)² + … + (0.00 + 0.13 + 0.07)²] / 3 − 0.1269

CF = 0.1269        SV = 0.0374

Now, our purpose is to determine whether any of the three shapes is superior to the others in promoting jet stability. In order to do this it is necessary to construct three pairs of linear, orthogonal contrasts. Each pair of these contrasts will compare one of the general shapes with the other two to determine whether it is statistically superior to them. In the following contrasts, N1, N2 and N3 indicate the straight-sided, convex and concave nozzles, respectively. The first pair of contrasts compares the straight-sided nozzles with the convex and concave nozzles and is …

L1(N) = N1/4 − (N2 + N3)/8

L2(N) = (N2 − N3)/4

For the details of calculating the variations of these contrasts, the reader may wish to consult Taguchi (1988). The variations for this pair of contrasts evaluate to …

SL1(N) = 0.0491 and SL2(N) = 0.0008

The second pair of contrasts compares the convex nozzles with the straight-sided and concave nozzles and is …

L3(N) = N2/4 − (N1 + N3)/8

exit velocity. This in fact has been done in our analysis for all three sets of linear, orthogonal contrasts, and the results are shown in Table 5 below. Again, for the sake of brevity, the actual calculations are omitted and the reader is referred to Taguchi (1988, vol. 1) for the details.

             Components of Jet Exit Velocity
Contrasts    Linear    Quadratic    Cubic    Quartic
SL1(N)       0.0000    0.0000       0.0000   0.0000
SL2(N)       0.0001    0.0001       0.0000   0.0000
SL3(N)       0.0000    0.0000       0.0000   0.0000
SL4(N)       0.0001    0.0001       0.0000   0.0000
SL5(N)       0.0001    0.0001       0.0000   0.0000
SL6(N)       0.0000    0.0000       0.0000   0.0000

Table 5 – Interactions of contrasts with jet velocity components
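As a cross-check, the correction factor, the velocity variation, and the first pair of contrast variations can be recomputed from the rounded values in Table 4. This is a minimal sketch assuming the standard Taguchi forms CF = T²/n and S_L = L²/(r·Σc²) with r = 8 runs behind each nozzle total; because the figures quoted above were computed from unrounded data, the rounded table reproduces them only approximately.

```python
# Recompute CF, SV, SL1, and SL2 from the rounded Table 4 values.
# Expect only approximate agreement with the figures quoted in the text
# (CF = 0.1269, SV = 0.0374, SL1 = 0.0491, SL2 = 0.0008), which were
# computed from unrounded data.
rows = {
    "straight": [0.07, 0.04, 0.05, -0.02, -0.02, -0.04, -0.01, 0.00],
    "convex":   [0.24, 0.11, 0.11,  0.07,  0.08,  0.08,  0.08, 0.13],
    "concave":  [0.19, 0.13, 0.09,  0.08,  0.08,  0.08,  0.07, 0.07],
}

n = sum(len(r) for r in rows.values())         # 24 observations in all
totals = {k: sum(r) for k, r in rows.items()}  # N1, N2, N3
grand = sum(totals.values())
cf = grand ** 2 / n                            # correction factor

# Variation due to jet exit velocity: eight column totals of three values each.
cols = [sum(c) for c in zip(*rows.values())]
sv = sum(t ** 2 for t in cols) / 3 - cf

# First pair of orthogonal contrasts and their variations,
# S_L = L^2 / (r * sum of squared contrast coefficients), with r = 8.
n1, n2, n3 = totals["straight"], totals["convex"], totals["concave"]
l1 = n1 / 4 - (n2 + n3) / 8
l2 = (n2 - n3) / 4
sl1 = l1 ** 2 / (8 * (1 / 16 + 1 / 64 + 1 / 64))
sl2 = l2 ** 2 / (8 * (1 / 16 + 1 / 16))

print(round(cf, 4), round(sv, 4), round(sl1, 4), round(sl2, 4))
```

With the rounded table this gives roughly 0.129, 0.038, 0.050, and 0.0008, consistent with the quoted 0.1269, 0.0374, 0.0491, and 0.0008 to within rounding.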
ST1 = 0.8918 and r-bar = 2.50

Se² = ST1 / r-bar

Se² = 0.8918 / 2.5

Se² = 0.3567

1) The jet stability afforded by the straight-sided design is insignificant compared with that of either the convex or the concave designs.
2) The jet stability afforded by the convex design is approximately 26% greater than that for the concave design.
3) There are no significant interactions between any of the contrasts and the components of jet velocity.
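The error-variance arithmetic above is easy to verify directly:

```python
# Se^2 = ST1 / r-bar, using the values quoted above.
st1 = 0.8918
r_bar = 2.50
se2 = st1 / r_bar
print(round(se2, 4))
```

This reproduces the quoted Se² = 0.3567 (the exact quotient is 0.35672).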
Robert A. Page
Southern Connecticut State University
pager1@southernct.edu
and Argan (2007) suggest that the audience pays attention to and accepts brand placement in movies and takes celebrities as references when shopping. However, the movie should not be over-commercialized. At the same time, attitudes toward product placements do not differ based on gender, age, income, or education. Authors van Reijmersdal, Neijens, and Smit (2007) have found that as consumers watch more episodes, the brand image becomes more in agreement with the program image. This confirms that learning and human association memory are important to brand placement.

(6) To bring a change in the audiences' purchase behaviors and intent
Product placements are associated with increased purchase intent and sales, particularly when products appear in sitcoms, e.g., Ally McBeal in Nick and Nora pajamas, Frasier and Friends in Starbucks and New World Coffee, and Cosmopolitan martinis in Sex and the City (Russell and Stern, 2006; Panda, 2004). In one example, Dairy Queen was featured on The Apprentice. The contestants needed to create a promotional campaign for the Blizzard. During the week of the broadcast, Blizzard sales were up more than 30%. Website hits also were up significantly on the corporate and Blizzard Fan Club sites as well as the Blizzard promotional site. While DQ had six minutes of screen time, the overall tone was a little harsh with two contestants arguing. So, it was not the most positive environment for good, old-fashioned DQ. However, it cost DQ in the "low seven figures" to appear on the show and run its supporting promotion. Not bad for a 30% increase in sales. (Cebrzynski, 2006)

(7) To create favorable practitioners' views on brand placement
Practitioners' views on product placement generally are favorable or else the product placement market would not continue to increase. Practitioners remain positive about product placements as long as no harm is done, sales and brand image go up, and consumers are positive about the product and brand. Also, product placements help the practitioner make up for an increasingly fragmented broadcast market due to technology such as TiVo and DVD recorders.

(8) To promote consumers' attitudes towards the practice of brand placement and the various product placement vehicles
Viewers tend to like product placements as long as they add realism to the scene. Snoody (2006) has found that viewer enjoyment of product placements actually increased for media vehicle versions of product placements where products were an integral part of the script. He conjectures that people's lives are so saturated with brands that the inclusion of identifiable products adds to the sense of reality, that is, validates the individual's reality. About half of respondents said that they would be more likely to buy featured products. People with more fashionable and extroverted lifestyles typically have more positive attitudes toward product placement (Tsai, Liang, and Liu, 2007). Also, product placements are preferred to fictitious brands and are understood to be necessary for cost containment in the making of programs and movies (Pokrywczynski, 2005). Also, if brand image is positive, then consumers' brand evaluation towards the product placement seems to be more positive (Panda, 2004). Older consumers are more likely to dislike product placements and more likely to consider the practice as manipulation (Nelson and McLeod, 2005). (Daugherty and Gangadharbatla, 2005)

USE OF PRODUCT PLACEMENT IN SPECIFIC MEDIA
Most product placement is done through television, film, and video games. However, regardless of the media used, the brand's image and the content vehicle need to fit in such a way that the product/brand image will not be harmed and that attention will be brought to the product or brand (Cebrzynski, 2006). Also, because advertisers continue to look for ways to stay in touch with consumers, they easily could follow their audiences into less-regulated media such as the Internet and 3G mobile phones. As web-connected television becomes a practical reality, a user-driven environment and peer-to-peer file swapping is being reinforced. While new platforms such as 3G and MPEG-4 create greater opportunities for interactivity, successful
product placement still must be relevant to its host content. (New Media Age, August 11, 2005)

Television
Television viewing is complicated by the use of zipping, zapping, TiVo, and DVRs. That is, the audience can shift the channel, change the program, and slow down or fast-forward the program to avoid advertising. In addition, media clutter, similarity of programming across channels, and channel switching behavior all compound the advertising effectiveness of television. (Panda, 2004) As a result, top-rated television shows are not necessarily the best places for product placements. Product placements depend on a number of factors, including length of the time on air, when and how products are woven into the story line, and targeted audience. (Friedman, 2003)

Plot connection (Russell, 1998, 2002; Holbrook and Grayson, 1996) is the degree to which the brand is woven into the plot of the story. Lower plot placements do not contribute much to the story, while higher plot placements comprise a major thematic element. Essentially, verbally mentioned brand names that contribute to the narrative structure of the plot need to be highly connected to the plot; visual placements can serve an accessory role that is lower in plot connection; and combined audio-visual placements need to be even higher in plot connection. (Panda, 2004) Also, prominent brand placements in television have a more significant advantage than subtle brand placements (Lord and Gupta, 1998).

Film
What is the effect of Tom Cruise chewing Hollywood chewing gum or Agent 007 using a BMW? These are typical examples of product placement in movies. Higher involvement is required to view a movie than for viewing television. Television viewers can multi-task in the home setting, thereby reducing their attention span and brand retention. Moviegoers actively choose the experience, movie, time, and cost. As such, they are much more receptive to the brand communication during the movie. (Panda, 2004) A majority of movie watchers have a positive attitude toward this form of marketing communication, feeling it is preferable to commercials shown on the screen before the movie. (d'Astous and Sequin, 1999) More frequent viewers and viewers who enjoy the movie more pay attention to product placements in the movie (Argan, Veliogly, and Argan, 2007).

Shapiro (1993) has classified four types of product placements in movies: (1) provides only clear visibility of the product or brand being shown without verbal reference, e.g., a bottle of Coca-Cola sitting on the counter; (2) used in a scene without verbal reference, e.g., an actor drinks a Coca-Cola but does not mention anything about it; (3) has a spoken reference, e.g., "Boy, I'm thirsty for a Coke"; and (4) provides brand in use and is mentioned by a main star, e.g., an actor says "This Coke tastes so refreshing" while drinking the Coke. The star using and speaking about the brand in the film is assumed to have higher impact than the mere visual display of the brand. That is, meaningful stimuli become more integrated into a person's cognitive structure, are processed deeply, and generate greater recall. (Panda, 2004)

Video Games
Products and brands are expanding into video games and even creating their own games. Essentially, there are three general approaches to game advertising: (1) traditional product placements, signs, and billboard ads that are just in the games from the beginning and cannot be changed later, (2) dynamic advertising wherein new ads can be inserted at any time via the Internet, and (3) advergames, or rebranded versions of current games that blatantly promote a single product throughout. Until now, advergames and product placements have been the leading forms of in-game advertising. For example, Burger King created three games suitable for the whole family: racing, action, and adventure, featuring The King, Subservient Chicken, and Brooke Burke. Their target market of young males meshes well with the Xbox audience, given that 18-34 year old men have been a hard group for marketers to get their product or name in front of. (Cebrzynski, 2006; New Media Age, October 27, 2005)
However, dynamic advertising is taking off and is probably the wave of the future. In dynamic advertising, a marketer can specify where ads are put, can set times when ads will run, can choose which audience type the ad goes to, and can get all the tracking available for Internet ads. Approximately 62% of gamers are playing online at least some of the time. Researchers can track how long each ad is on the screen, how much of the screen the ad occupies, and from what angle the gamer is viewing the ad. Companies pay for ad impressions – one ad impression constitutes 10 seconds of screen time. About half of video games are suitable for ads and could be contextually relevant to the game. More than 50 games already are receiving dynamic ad content, with another 70 set to go by year-end. (Hartley, 2007)

Another wave of the future is 3D ads that are twice as powerful as billboards. Also, the latest version of the digital video standard MPEG-4 offers the possibility to personalize storylines or even hyperlink from tagged content, so viewers can click on objects they want to buy. Whatever the future brings, advertising and product placements need to fit with the game. That is, video game developers need to incorporate advertising in an ambient way that will not distract players. In addition, the center of the screen gets the most attention, based on eye-tracking research. (New Media Age, October 27, 2005)

VARIABLES THAT IMPACT THE EFFECTIVENESS OF PRODUCT PLACEMENT
There is no universal agreement about what makes product placements effective. However, Balasubramanian, Karrh, and Patwardhan (2006) have developed a useful model comprised of four components: (1) execution/stimulus factors such as program type, execution flexibility, opportunity to process, placement modality, placement priming; (2) individual-specific factors such as brand familiarity, judgment of placement fit, attitudes toward placements, involvement/connectedness with program; (3) processing depth or degree of conscious processing; and (4) message outcomes that reflect placement effectiveness. According to the authors, execution and individual factors influence processing depth, which in turn impacts message outcomes. While this model still is relatively untested, it does make sense and provides an integrative framework. In the interim, until it is more universally tested and accepted, the variables that are discussed consistently in the literature are expanded below:

Visual/Audio/Combined Audio-Visual
Product placements can be visual only, audio only, or combined audio-visual. Most product placements are visual, which involves demonstration of a product, brand, or visual brand identifier with no sound. In the audio product placement, there is audio placement but the brand is not shown. However, visual only or audio only may not get noticed. About one in 10 brands on television occurred as plugs verbally delivered by an on-camera personality. Visual product placements last an average of 6.2 seconds, with the majority (68%) lasting for 5 seconds or less. The verbal mentions lasted 5.5 seconds on average. For consumers to form an association with the product placement, the brands need to be on screen longer or with more impact. The combined hybrid strategy of audio-visual shows the product or brand and also mentions it in the medium. Only about 3.1% of product placements in sitcoms/dramas were both visual and auditory. Recall is enhanced from dual-modality processing. (La Ferle and Edwards, 2006) This dual mode requires creativity and greater cost so that it does not interfere with the natural flow of the program. (Argan, Velioglu, and Argan, 2007)

Primary Product Placement Strategies
There are three primary product placement strategies (d'Astous and Sequin, 1999; Panda, 2004):
1. Implicit product placement strategy: The brand, logo, the firm, or the product is presented passively with only clear visibility within the program without being expressed formally. This product placement is more contextual or part of the background with no clear demonstration of product benefits, e.g., wearing clothes with the sponsor's name or a scene in front of the Gap. A second type of implicit product placement strategy is when
the product is used in a scene but no spoken attention is given to the product.
2. Integrated explicit product placement strategy: In this strategy, the brand, logo, the firm, or the product plays an active role in the scene and is expressed formally within the program or plot, e.g., Domino's pizza delivered in a scene where everybody is starving and actually eats it. That is, the attributes and benefits are demonstrated clearly and mentioned by the main star. In general, explicit product placements are more effective than implicit placements (Panda, 2004).
3. Non-integrated explicit product placement strategy: In this strategy, the brand, logo, the firm, or the product is formally expressed but not integrated into the contents of the program, e.g., the program was sponsored by Toyota or the Hallmark movie by Hallmark. This type of reference normally is included in the sponsorship deal.

Brand/Sponsor Image
An expected result of television sponsorship is the transfer of the program's image to the sponsor. As such, a product placement needs to be associated with programs and stars that are congruent with the sponsoring identity. However, a more positive image of the sponsor does not lead to significantly better consumer reactions to product placements. (d'Astous and Sequin, 1999) But, higher levels of involvement with the movie or program have a greater impact potential. Also, on-set brand placement can experience success regardless of the viewer involvement with the segment. (Pokrywczynski, 2005)

Sponsor-Program Congruity
A strong sponsor-program congruity suggests that the sponsor's products and activities are clearly related to the contents of the program, i.e., the product placement is likely to be natural and consistent with the program. However, when sponsor-program congruity is weak, the product placement may be seen as inconsistent and not credible. For example, if a sponsor has a product that is placed in a reality show competition, the product may or may not be used as the sponsor intends it to be used, and the new uses may alienate current segments. (d'Astous and Sequin, 1999) In general, however, greater interplay between the characters and the brand makes the most successful product placement (La Ferle and Edwards, 2006).

Type of Television or Media Program
Consumers' reactions toward a product placement may be impacted by the type of television program. There are four general types of television programming: (1) quiz/variety shows wherein the need for entertainment is maximized, (2) mini-series/dramas which satisfy the need to identify oneself with characters, (3) information/services programming which capitalizes on the need for information, and (4) sports and cultural events wherein the sponsoring is more linked to the event than it is to the program. Consumer evaluations are most negative when a product placement occurs in mini-series/dramas. (d'Astous and Sequin, 1999) In general, success of the product placement is dependent on the success of the programming content (Panda, 2004).

On the other hand, 18-34 year-old men spend little or no time watching TV, but they do like gaming. Unpaid product placements in games have been around for more than 10 years. In the past, the limitation has been the consoles, but new consoles are launching that have Internet connectivity built in. So, product placements in video games are expected to increase and improve. Predictions indicate that dynamic in-game product placements are taking off and will continue to do so. (New Media Age, October 27, 2005)

Degree of Viewers' Parasocial Relationship or Attachment with the Characters
Character attachment is a function of three variables or stages: (1) inside-program character-product relation, (2) outside-program consumer-character relation, and (3) interaction between inside and outside influences in the consumer-product attitude. As a general statement, consumers align their attitudes toward a product with the inside-program characters' attitudes toward the product. This alignment process is driven by the consumers' extra-program attachment to the characters. Or, the
consumer's attitude toward the product is a function of the valence and strength of the character-product association in the program and the valence and strength of the consumer-character attachment. As such, if a consumer experiences a strong parasocial relationship or alignment with the television character, then this will contribute to the formation and change of consumers' attitudes toward consumption and attitude alignment with the product. (Russell and Stern, 2006; Russell, Norman, and Heckler, 2004)

Degree of Clutter around the Product Placement
Product placement is impacted by how much other product placement there is in the show, and product placements are prevalent, with one brand appearing in every three minutes of programming. Because there is so much clutter, the value of any one brand may be limited. Hence, this cluttered promotional environment and product placements must be managed actively. Game shows offer the greatest exposure, and storied programming offers the fewest and least prominent product or brand appearances. (La Ferle and Edwards, 2006) Too much product placement can annoy viewers and be viewed as unacceptable commercialism. However, communication interference from competitors can be limited. This occurs because the sponsor often buys a good portion of the commercial time within a program. When the product placement is integrated into the program, the realism is enhanced and the likelihood of viewer zapping due to clutter is reduced. (Daugherty and Gangadharbatla, 2005; d'Astous and Seguin, 1999) In general, other competing products in the product category should not be incorporated into the programming (Panda, 2004).

DOWNSIDE OF USING PRODUCT PLACEMENT
There are several downsides to using product placements: (1) lack of control, (2) media programming may not be successful, (3) possibility of negative character association, (4) difficulty in pricing product placements, and (5) product placement ethics.

The first downside to using product placements is that marketers may have a lack of control over how products are portrayed or incorporated into a scene or storyline (Daugherty and Gangadharbatla, 2005). Products may end up being misused, ignored, criticized, associated with questionable values, or used unethically. This can be especially true in reality shows. So, advertisers must exert greater control over product or brand appearances to ensure their prominence. Most appearances are visual or verbal but rarely both. In addition, most brands are portrayed neutrally and for less than five seconds. (La Ferle and Edwards, 2006)

The second downside to using product placements is that marketers have little if any influence over how successful media programming will be. That is, it is difficult to predict where to place brands for maximum positive exposure. Also, if one places too many product placements, then consumers may feel that they have had enough and the saturation may have a negative effect. Product placement cannot be so overwhelming that it detracts from the television show, movie, or content vehicle. (Cebrzynski, 2006)

A third downside to using product placements is the possibility of negative character association. If the product is associated with a particular character, then when that character falls from grace or does something inappropriate, the product or brand may be tarnished as well. That is, the target audience may shift its attitude about a character in such a way that products associated with the character also will be diminished.

The fourth downside to using product placements is the difficulty in pricing product placement (The Economist, 2005). Typical placement fees are based on a fairly standard scale of expected audience size for the media vehicle. This pricing method assumes that product placement exposure is equal across events and scenes and equal to the exposure value marketers get from a 30-second commercial. However, evidence exists supporting the premise that how and when the products appear in the media vehicle may be more important in determining cost and value. (Pokrywczynski, 2005)
With product placements continuing to grow in the TiVo era, businesses are under increasing pressure to show a return on investment for that activity. As a result, marketers are becoming more sophisticated about measuring product placements. Several firms offer tools that rate product placements in terms of viewer impact and dollars and cents. For example, Nielsen is creating a tool that would produce product placement ratings. Even though Nielsen has tracked 100,000 product placements since the fall of 2003, it is not easy to get product placement data to assess the effectiveness and price of product placements. However, some means are available. For example, weekly product placement ratings can be issued to subscribers. Or, a weekly summary of top-ranked product placements is available in the Wall Street Journal or other publications that assess some sweeps-related data. (McClellan, 2003) And, at least a dozen other firms have established unique methods of measuring product placements. Some of the variables that are tracked in various measurement systems include: placement foreground or background, oral or visual, main character or sidekick, how much air time compared to competition, viewer response, awareness scale, product placement to commercial cost ratio, degree of integration, whether or not it does harm, "fit" scores, object or purpose of placement, reach, credible use of product placement, and meeting of long-term strategies and objectives. (Wasserman, 2005) Regardless of which measurement system and variables are used, there will continue to be differing opinions over the value of product placements and how to measure that value. In addition, the concern is that once marketers quantify product placement value, then product placement will become a standardized commodity.

One example of a company releasing groundbreaking research for product placement measurement is Visure Corp., a privately-held company based in Richmond, VA. Visure Corp. offers innovative, web-based cataloging and measurement services that help marketers justify and maximize product placement effectiveness. The company's integrated product placement solution has the ability to measure what consumers are doing in response to product placements (e.g., brand research, purchase intent, and word-of-mouth) and can multiply the quantity of product placements by 4-6 times without compromising the production. For example, they found that film and television product placements both generate an average response and action rate of 25-30%; hence, converting a product placement into consumer action is remarkably high and measurable. (Song, 2007)

One last downside of product placement is that of product placement ethics. In particular, some individuals question the ethics of product placement. Product placement ethics are discussed below.

ETHICS OF PRODUCT PLACEMENT
Many consumers and researchers consider product placements as excessive commercialization of the media and an intrusion into the life of the viewer. The viewer does not go to the movie or television to see the product placement. Rather, viewers often attend movies and television to escape the realities of life. As such, marketers must make the product placement look obvious at the point of emergence. Sometimes, the product or brand placement is woven into the plot, like ET's Reese's Pieces, with a 65% increase in sales due to this placement (Balasubramanian, Karrh, and Patwardhan, 2006). If the plot connection and the reflection of the user or character in the film are missing, the product placement often is futile. Hence, relevance of the product to the situation needs to be created by possibly incorporating the placement planning at the script level. (Panda, 2004)

Product placement caters to a captive audience even though the brands are shown in their natural environment. But, viewers generally tend to like product placements unless there are too many. Product placements typically are seen as an acceptable practice, frank, amusing, pleasant, and dynamic. In general, viewers consider product placements to enhance realism, aid in character development, create historical subtext, and provide a sense of familiarity. However, if the audience realizes that the product placement was placed there, it may affect their judgment and they may counter-argue the placed messages. (Panda, 2004)
Still, some individuals feel that product placements are sinister and should be banned or at least clearly disclosed in the credits at the end of the program (The Economist, 2005). This is particularly true for implicit product placements, which should be avoided since they are perceived as less ethical than the other types of product placements, particularly if they appear in an information/series television program. Also, consumers' ethical opinions about product placements differ significantly across product categories, with greater concern for ethically controversial products such as alcohol, cigarettes, and guns (d'Astous and Seguin, 1999).
Toys are likely products for product placement, but the use of toy placements is very low, little better than chance. Children's programming is highly regulated by the FCC, and much product placement in children's shows is essentially banned: a 2004 APA finding states that kids under 8 have difficulty distinguishing ad content from program content. In addition, the Children's Advertising Review Unit makes voluntary rules regarding advertising to children, and companies typically honor these rules. While some companies have product placement strategies or policies, television generally offers few venues for kids and toy product placements. As an interesting research note, 3- to 5-year-olds were asked to sample identical fast food. One bag had a McDonald's golden arches logo while the other bag was unmarked. With exactly the same food and only a different bag, children preferred the branded version. Even young kids are swayed by advertising and brand placements (Business Wire, 2007; Auty and Lewis, 2004; Edwards, 2006).
Newspapers also are under ethical scrutiny. An important ethical issue is whether or not newspapers should be allowed to use product placements. While they are under pressure to do so, they have so far held out against blurring the line between content and ads (The Economist, 2005). The fear is that if product placements seep into newspapers and news magazines, editorial content may be jeopardized and credibility violated (Callahan, 2004). Still, rigid editorial content policies appear to be easing up as marketers and agencies pressure publishers to mix branded mentions into their stories (Mandese, 2006).
An additional area of ethical controversy is pharmaceutical product placements. The FDA requires that pharmaceutical companies give both the benefits and the side effects of their drugs. This clearly will not work for product placements. However, the FDA does allow reminder ads that only mention the brand with no other information or claims. The FDA currently is reviewing all promotional activity following calls by medical groups to ban direct-to-consumer drug ads. It is not clear how product placements will fit into the review (Edwards, 2005). The drug industry association PhRMA has published guiding principles for advertising prescription medicines: the industry should adapt its television ads to what is appropriate for the audience, health and disease awareness should be part of the promotion, products should be discussed with doctors before drug campaigns are launched, and low-income groups should be provided with information about health assistance programs. (Business Monitor International, 2007)

SUMMARY
Product placement has become an increasingly popular way of reaching potential customers who are able to zap past commercials. To reach these retreating audiences, advertisers use product placements in more clever, effective ways that do not cost too much. The result is that the average consumer is exposed to 3,000 brands a day through venues including billboards, T-shirts, tattoos, schools, doctors' offices, ski hills, and sandy beaches (Nelson and McLeod, 2005). A wise caveat to consider for product and brand placement is: "Our philosophy is if the brand doesn't make the show better, the brand doesn't make the show. People must not notice the integration, but they must remember it. That's the test." (Stringer, 2006, p. 1) The ideal product placement situation is win-win-win-win: the customer gets to know about new and established products and their benefits, the client gets relatively inexpensive branding of its product, the media vehicle gets a brand for free or can reduce its production budget, and the product placement agency gets paid for bringing the parties together.
Abstract

The role of customers' advice as innovation consultants offers valuable experience to executives, providing data and strategies to enhance a corporation's creativity. This study provides a quantitative and qualitative survey of how consumers evaluate this new concept of a customer-based marketing public relations trend, with Turkish innovation cases selected according to "The Most Successful Turkish Brands in 2006".

1. Introduction

Marketing focuses on consumers, as in consumer relations, which is part of public relations. Traditional consumer relations tactics, such as product-oriented tactics, can work hand in hand with marketing tactics such as innovation strategies.
New product ideas can come from many sources: customers, scientists, employees, competitors, channel members and top management.
Today the new business environment makes new customer-based concepts possible in place of classical marketing concepts. Corporations that achieve high customer retention and high customer profitability aim for the right product (or service), to the right customer, at the right place, at the right time, through the right channel, to satisfy the customer's need or desire [1]. Current economic trends and the impact of technology on business life have put an end to the product-based management strategy. Companies grow and prosper when they efficiently meet the needs of customers. As the world moves into the new millennium, four particular changes are reshaping the environment of business: the globalization of markets, changing industrial structures, the information revolution and rising consumer expectations.

2. The theoretical framework

2.1. The theoretical framework of marketing public relations

Integrated Marketing Communications (IMC), according to the American Marketing Association, is "a planning process designed to assure that all brand contacts received by a customer or prospect for a product, service or organization are relevant to that person and consistent over time". The definitions of the IMC concept are many and varied. The IMC concept first emerged in the 1980s and developed relatively slowly into the middle of the next decade. Smith et al. (1997) argued for three different definitions of IMC, each with common elements. These grouped definitions are as follows: [2]
1- those that refer to all marketing communications
2- a strategic management process based on economy, efficiency and effectiveness
3- the ability to be applied within any type of organization.
The marketing communications industry often uses terms such as advertising, public relations, sales promotion, direct mail or sponsorship to describe what it does. The relationship between the public relations and marketing functions has always been a somewhat ambiguous and controversial one. The three main pillars of IMC are public relations, advertising and marketing. Although these three disciplines have much in common, much also differentiates them. Definitions for each discipline are as follows: [3]
Advertising is the use of controlled media (media in which one pays for the privilege of dictating message content, placement, and frequency) in an attempt to influence the actions of target publics.
Marketing is the process of researching, creating, refining and promoting a product or
Table 1: 4 C in Marketing [7]

4P          4C
Product     Customer needs & wants
Price       Cost (to the customer)
Place       Convenience
Promotion   Communication

2.3. Corporate innovation strategy and corporate citizenship

A significant boost to corporate citizenship initiatives was given in 1996 when President Clinton called to Washington a group of leading business people to discuss the notion of corporate citizenship and social responsibility. The President encouraged the business leaders "to do well by their employees as they make money for their shareholders" [8]. According to Carroll, corporate citizenship has four faces: an economic face, a legal face, an ethical face and a philanthropic face [9]. Good corporate citizens are expected to:
- Be profitable (carry their own weight or fulfill their economic responsibility)
- Obey the law (fulfill their legal responsibilities)
- Engage in ethical behavior (be responsive to their ethical responsibilities)
- Give back through philanthropy (engage in corporate contributions).
Corporate citizenship addresses the relationship between companies and all their important stakeholders, not just employees.
Once a company has carefully segmented the market, chosen its target customer groups, identified their needs, and determined its desired market positioning, it is ready to develop and launch appropriate new products. Marketing management plays a key role in the new product development process.
Every company must carry on new product development. Replacement products must be created to maintain or build sales. Furthermore, customers want new products, and competitors will do their best to supply them.
New product development can take two forms. The company can develop new products in its own laboratories, or it can contract with independent researchers or new product development firms to develop specific products for the company. [7]
Nowadays, we can talk about another new marketing management trend in which the customer is considered a business partner in corporate innovation strategies.
The consulting firm Booz, Allen & Hamilton has identified six categories of new products in terms of their newness to the company and to the marketplace: [10]
- New-to-the-world products: New products that create an entirely new market.
- New product lines: New products that allow a company to enter an established market for the first time.
- Additions to existing product lines: New products that supplement a company's established product lines, such as package sizes, flavours, etc.
- Improvements and revisions of existing products: New products that provide improved performance or greater perceived value and replace existing products.
- Repositioning: Existing products that are targeted to new markets or market segments.
- Cost reductions: New products that provide similar performance at lower cost.
Companies' existing products are vulnerable to changing consumer needs and tastes, new technologies, shortened product life cycles, and increased domestic and foreign competition.
New products sometimes fail. Several factors may be responsible [11]:
1- A high-level executive may push a favourite idea through in spite of negative market research findings.
2- The idea may be good, but the market size is overestimated.
3- The actual product is not well designed.
4- The new product is incorrectly positioned in the market, not advertised effectively, or overpriced.
5- Development costs are higher than expected.
6- Competitors fight back harder than expected.
That's why the role of customers' advice as innovation consultants offers valuable
experience to executives, providing data and strategies to enhance a corporation's creativity. This new trend should be one of the key success factors, because a well-defined concept prior to development, in which the company carefully defines and assesses its target market and product requirements, assures many benefits. Kotler affirms that: "The teamwork aspect is particularly important. New product development is most effective when there is teamwork among research & development, engineering, manufacturing, purchasing, marketing, finance and customer relationship from the beginning of the innovation process. The product idea must be researched from a marketing point of view and a specific cross-functional team must guide the project throughout its development." [7] p. 309
Successful new product development requires the company to establish an effective organization for managing this innovative process. Top management is ultimately accountable for the success of new products. It should define the business domains, product categories and customer feedback that the company wants to emphasize.

3. New product development decision and development process

New product ideas can come from many sources: customers, scientists, employees, competitors, channel members and top management.
The marketing concept holds that customers' needs and wants are the logical place to start the search for innovation ideas. Von Hippel has shown that the highest percentage of ideas for new industrial products originates with customers [12].
Companies can identify customers' needs and wants through surveys, projective tests, focus group discussions, and suggestion and complaint letters from customers. Many of the best ideas come from asking customers to describe their problems with current products.
Arzum, a Turkish producer of small electric appliances, for example, asks buyers at major points of purchase what they like and dislike about its products and what improvements could be made. It then evaluates these demands to develop new products or to modify existing ones. Arzum General Manager Murat Kolbaşı said that their market share increased from 0% to 20% with their product entitled "Arzum Ehlikeyif Kettle". He explained that this product came from a new product idea suggested by customers who often drink tea and would like to drink Turkish coffee with the same apparatus.
Gencturkcell, a brand of one of the biggest Turkish GSM operators, for example, conducts its business projects with young people through the "gnçtrkcll club" in "The Voice of Youth" project. The 13 million members of this "gnçtrkcll club" help to manage organizational innovation strategies. Turkcell CRM-Based Marketing Department Chief Burak Sevilengül said that, in order to understand their target audience better, they collaborated with 150 young people (average age between 16 and 25). He added that the "lovebird" campaign was designed with the comments of the "gnçtrkcll club" members and achieved great success.
Companies also rely on their scientists, engineers, designers and other employees for new product ideas. Successful companies have established a company culture that encourages every employee to seek new ways of improving the company's production, products and services.
Akbank, a Turkish bank, for example, evaluated its customers' feedback about their credit demands. Customers would like to obtain bank credit without going to the bank's branches, without waiting a long time in the branch and without completing long forms. Thus, the marketing and R&D managers worked together to create a fast and easy way of asking for credit. They launched "mobile credit", which allows customers to make their credit demands using their mobile phones. Akbank Individual Bank Operations Marketing Department Chief Cem Muratoğlu said that CRM-based strategies have an important role in competing banks' successes, and that the "mobile credit" application designed from customers' feedback was a great success: 700,000 people applied for "Akbank mobile credit" in 2006 alone.
Companies can also find good ideas by examining their competitors' products and services. They can learn from distributors, suppliers and sales representatives what competitors are doing. These parties are often the first to learn about competitive developments.
They can find out what customers like and dislike in their competitors' new products.
If the product concept passes the business test, after evaluation of the different sources of new product ideas, it moves to R&D and/or engineering to be developed into a physical product. These departments will develop one or more physical versions of the product concept. The goal is to find a prototype that consumers see as embodying the essential elements described in the product-concept statement and that performs safely under normal use and conditions.
Producers and R&D people must not only design the product's required functional characteristics but also know how to communicate its psychological aspects through physical cues. How will consumers react to different colours, sizes, weights and other physical characteristics? Marketers need to supply producers with information on what attributes consumers seek and how consumers judge whether these attributes are present.
When the prototypes are ready, they must be put through functional and consumer tests, with consumers who are now business partners in corporate innovation strategies. Consumer testing can take a variety of forms, from bringing consumers into a laboratory to giving them samples to use in their homes. [7] p. 327

3.1. The consumer adoption process

In this study, we asked: "How do final and potential customers learn about new products, try them, and adopt or reject them?" Management must understand this consumer adoption process to build an effective strategy for market share. Kotler affirms that "adoption is an individual's decision to become a regular user of a product" [7] p. 335. The consumer adoption process is later followed by the consumer loyalty process, which is the concern of the established producer.
An innovation refers to any good, service or idea that is perceived by someone as new [13]. The idea may have a long history, but it is an innovation to the person who sees it as new. Rogers defines the innovation diffusion process as "the spread of a new idea from its source of invention or creation to its ultimate users or adopters" [14].
Adopters of new products have been observed to move through the following five stages:
- Awareness: The consumer becomes aware of the innovation but lacks information about it.
- Interest: The consumer is stimulated to seek information about the innovation.
- Evaluation: The consumer considers whether to try the innovation.
- Trial: The consumer tries the innovation to improve his or her estimate of its value.
- Adoption: The consumer decides to make full and regular use of the innovation.
Companies seek to estimate variables such as trial, first repeat, adoption, and purchase frequency of consumer products. Therefore they conduct consumer goods market testing. It is true that many of the best ideas come from asking customers, as users, to describe their problems with current products. In this context, Rogers sees the five adopter groups as differing in their value orientations [14]: innovators are willing to try new ideas at some risk. Early adopters are guided by respect; they are opinion leaders in their community and adopt new ideas early but carefully. The early majority are deliberate; they adopt new ideas before the average person, although they rarely are leaders. The late majority are sceptical; they adopt an innovation only after a majority of people have tried it. Finally, laggards are tradition-bound; they are suspicious of changes and mix with other traditional people. They adopt the innovation only when it takes on a measure of tradition itself.

4. Methodology

The purpose of this study is to examine the following questions: What are the main stages in customer-oriented new product development, and how can they be managed better? What factors affect the rate of consumer adoption of newly launched products?
This paper aims to examine the marketing public relations literature and to identify links between the concepts of consumers as business partners and corporate innovation strategies. An empirical study was designed to evaluate the essences of experience relating to consumer-based
innovation management and the perception of this new trend among final consumers. A representative sample of 100 randomly selected people was to be interviewed with a questionnaire. The form consisted of 3 questions on demographic parameters and 12 questions on respondents' perception of corporate innovation experiences. However, only 73 of our sample answered; therefore 73 interview transcripts were analyzed and coded in the SPSS program, and the essences of experience were collectively synthesized in the description. We used descriptive statistical techniques such as the chi-square test, correlation and frequencies to evaluate our findings. We measured the demographic variables gender, age and job. Our sample consisted of 38 women (52%) and 35 men (48%).

5. Findings

Table 1. Gender
              Frequency   Percent   Valid Percent
Valid  W         38        52.1        52.1
       M         35        47.9        47.9
       Total     73       100.0       100.0

Table 2. Age
              Frequency   Percent   Valid Percent
Valid  18-25     16        21.9        21.9
       26-35     32        43.8        43.8
       36-45     15        20.5        20.5
       46-55      1         1.4         1.4
       56-65      5         6.8         6.8
       over 65    4         5.5         5.5
       Total     73       100.0       100.0

Our data suggest that 32 (44%) of the sample were between 26 and 35 years old, 16 (22%) were between 18 and 25, and 15 (21%) were between 36 and 45.

Table 3. Transmitting feedback
              Frequency   Percent   Valid Percent
Valid  yes       52        71.2        71.2
       no        21        28.8        28.8
       Total     73       100.0       100.0

We determined that 52 (71%) of our respondents transmit their feedback about products to producers, while 21 (29%) prefer not to. These data suggest that the majority of consumers are conscious and feel responsible for the products and services which they use.

Table 4. Preference of channel
                               Frequency   Percent   Valid Percent
Valid  customer representative     6         8.2         8.2
       call centre                12        16.4        16.4
       e-mail                     10        13.7        13.7
       fax                         1         1.4         1.4
       1-2                         4         5.5         5.5
       1-3                         6         8.2         8.2
       2-3                         1         1.4         1.4
       2-4                         1         1.4         1.4
       3-4                         1         1.4         1.4
       1-2-3                       8        11.0        11.0
       1-2-3-4                     2         2.7         2.7
       N/A                        21        28.8        28.8
       Total                      73       100.0       100.0

(Combined codes indicate respondents who use several channels; the numbering follows the order of the four channels above.)

Our findings suggest that the largest groups of respondents prefer call centres (12, 16%) and e-mail (10, 14%) to transmit their feedback about products and/or services. We also measured that 8 (11%) of our respondents prefer to use customer representatives, call centres and e-mail together. We may say that customers look for rapid communication with producers, and e-mail is considered one of the most preferred transmission channels.
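As an illustration, the frequency and valid-percent figures reported in these tables can be recomputed in a few lines. The sketch below is a hypothetical recomputation of Table 3 (the authors used SPSS; the yes/no counts are taken directly from Table 3):

```python
from collections import Counter

# Rebuild the Table 3 responses: 52 "yes" and 21 "no"
# among the 73 usable questionnaires (counts taken from Table 3).
responses = ["yes"] * 52 + ["no"] * 21

counts = Counter(responses)
n = sum(counts.values())  # 73 valid (non-missing) answers

# Valid percent: each answer's share of the valid responses.
valid_percent = {answer: round(100 * c / n, 1) for answer, c in counts.items()}

print(n, valid_percent)  # 73 {'yes': 71.2, 'no': 28.8}
```

Applying the same computation to the counts in Table 4 reproduces its valid-percent column as well.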
[8] Archie B. Carroll, "The Four Faces of Corporate Citizenship", Business and Society Review, 100/101, 1998, pp. 1-7.
[9] Archie B. Carroll, "The Pyramid of Corporate Social Responsibility: Toward the Moral Management of Organizational Stakeholders", Business Horizons, July-August 1991, pp. 39-48.
[10] New Products Management for the 1980s (New York: Booz, Allen & Hamilton, 1982).
[11] Kotler, P., Marketing Management: Analysis, Planning, Implementation and Control, International Edition, 9th ed., Prentice Hall, Upper Saddle River, New Jersey, 1997.
[12] Eric von Hippel, "Lead Users: A Source of Novel Product Concepts", Management Science, July 1986, pp. 791-805. Also see his The Sources of Innovation (New York: Oxford University Press, 1988); and "Learning from Lead Users", in Marketing in an Electronic Age, ed. Robert D. Buzzell (Cambridge, MA: Harvard Business School Press, 1985), pp. 308-17.
[13] Cooper, Robert G. and Kleinschmidt, Elko J., New Products: The Key Factors in Success, American Marketing Association, Chicago, 1990.
[14] Rogers, Everett M., Diffusion of Innovations, Free Press, New York, 1962.
The survey's target population included undergraduate and graduate students taking courses within the School of Business and Industry at Florida A&M University. These students were asked to complete a survey of 34 questions. Questions consisted of multiple choice and short answer items to assess factors related to perceptions of and reasons for using two online social networking sites: Facebook and MySpace. Surveys were administered to 309 individuals within the following seven undergraduate and graduate courses: Introduction to Business Systems, Systems Theory and Design, Organizational Behavior, Organizational Theory and Behavior, Intermediate Accounting Part I, Intermediate Accounting Part II and Governmental Accounting. All students were asked to complete the survey only once. The survey was divided into the following three major areas:

Section #1: Personal information, Questions (Q)1-4

Sixty percent of students who completed the survey were males. Over 95% of the students were between the ages of 18 and 24, and 99% of the students stated they were of Black/Non-Hispanic ethnicity. Over 98% of the individuals were students in the School of Business and Industry. 86% of the students stated they used online social networking tools such as Facebook and MySpace, and that they used them to keep in touch, to network, to supply information and for business purposes. Additional write-in answers included using the tools to view pictures and for fun. Over half of our respondents stated they have been using these tools for only 1-2 years, although over 50% of the students surveyed stated they use these tools at least once or twice a day (several times a day 21%, once/twice a day 31%). In addition, results from the survey revealed that although most students utilize Facebook and MySpace, only 40% stated they feel it is somewhat important.
first, second and third, respectively, for items that they liked least. Opportunity #2 relates to a survey question found within the third section. This question asked students what type of data they uploaded onto Facebook and MySpace. Over 79% of the students stated they uploaded self-photos, while 54% stated they uploaded personal information such as hometown and date of birth, and 51% stated they uploaded photos of their friends. Figure 1 shows the percentage of items uploaded to Facebook/MySpace.

[Figure 1. Uploaded Items to Facebook/MySpace: bar chart of the share of respondents (0 to 0.8) uploading Self-Photos, Photos of Friends, My Email Address, List of Current Classes, Relationship Status, and Personal Info.]

Challenges #3 and #5 relate to several questions found in the third section. Challenge #3 relates to a question which asked students if they ever decided not to post photos of themselves on Facebook/MySpace because the photos might be seen by current or future employers, university administrators and/or teachers, or parents. Most students stated they did NOT desire to upload photos because they feared current/future employers (61% stated Yes) might see them, more so than university administrators/teachers (43% stated Yes) and even parents (36% stated Yes). A following question concerning posting information about themselves revealed similar results.
Challenge #5 relates to two questions that discuss whether or not it is okay for either strangers or acquaintances to comment on their profile when you see each other in person. Students were asked to select "Yes", "No", or "Yes - only if they have a reasonable explanation for what they are looking at". Over 35% of the students surveyed answered Yes, that it was okay for strangers to comment, and over 55% answered Yes, that it was okay for acquaintances to comment. However, over 19% of the students chose the answer "Yes - only if they have a reasonable explanation for what they are looking at" for acquaintances, and 17% of the students chose that answer for strangers. See Figure 2 on whether acquaintances can comment on a profile.

[Figure 2: Can Acquaintances Comment on Profile.]

The survey also revealed interesting facts related to inappropriate sexual and racial material. 50% of the students stated that Facebook and MySpace should be monitored for inappropriate racial material, while 72% of the students stated that these online services should be monitored for inappropriate sexual material. Lastly, 76% stated that they have seen inappropriate sexual material on these sites.

4. Conclusion

There are several implications for utilizing social networking more effectively. It is a tool from which corporations and millennials can gain benefit. Corporations can use networks to recruit and secure referrals for potential employees. Firms can also incorporate Facebook and MySpace applications on computer workstations to encourage
Abstract

This article describes recent advances in both predictive scheduling (developing, before work begins, the schedule that will be most robust to uncertainty) and reactive heuristics (rescheduling procedures that a project manager may use in real time as uncertainty occurs) for the Stochastic Resource-Constrained Project Scheduling Problem (SRCPSP).

1. Introduction: Predictive and Reactive Scheduling

Stochastic project scheduling is generally concerned with developing an order of tasks within a project that is robust to uncertainty in areas such as resource availability, resource breakdown, or task durations. A robust schedule is "flexible" in that metrics of performance (most commonly project duration) remain as close as possible to optimal when disturbances occur [32]. To account for uncertainty in a project schedule, project managers may use two approaches: a predictive (or proactive) scheduling technique to develop a baseline schedule before work begins, or a reactive scheduling technique that provides decision points once work begins to react or reschedule in light of the outcomes of uncertain events. Managers may apply a combination of both of these techniques during the course of a project.

1.1. Predictive Scheduling

Predictive or proactive scheduling techniques allow a project manager to develop a baseline schedule that protects the project against the effects of uncertainty. Analytical approaches and techniques for developing robust proactive schedules have been extensively studied ([2], [17], [27], [45]). Approaches include redundancy-based techniques that allow for extra time in each activity or schedule extra resources. Other approaches include robust machine scheduling techniques that develop schedules to minimize the consequences of the worst-case scenario.

1.2. Predictive Scheduling – Buffering

Goldratt's [11] Critical Chain and Buffer Management method provides a heuristic framework for project managers to plan, schedule, and control their projects, but leaves it up to the user to determine the exact implementation. The method identifies the critical chain, the set of tasks that results in the longest path to project completion considering both precedence and resource constraints. A "feeding" buffer is placed at the points in the schedule where resources feed into the critical chain path, to protect the critical chain, while a "project" buffer protects the overall project completion time [12].
Tavares, Ferreira, et al. [36] have applied the concept of using buffers to protect the schedule by adding time buffers throughout a
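Goldratt's method leaves buffer sizing to the user. One widely used convention, assumed here purely for illustration and not taken from the papers cited, sizes each buffer at a fixed fraction (often 50%) of the duration of the chain it protects. A minimal sketch on a hypothetical serial project:

```python
# Toy illustration of Critical Chain buffer sizing. Assumptions (not from the
# cited papers): the critical chain and one feeding path are already identified,
# durations are "aggressive" estimates in days, and buffers are sized with the
# common 50%-of-chain rule, one possible implementation of Goldratt's heuristic.

critical_chain = {"design": 10, "build": 20, "test": 10}  # longest resource-feasible path
feeding_path = {"procure": 8}                             # feeds into "build"

def buffer_size(path, fraction=0.5):
    """Size a buffer as a fixed fraction of the protected path's total duration."""
    return fraction * sum(path.values())

project_buffer = buffer_size(critical_chain)  # placed at the end of the project
feeding_buffer = buffer_size(feeding_path)    # placed where "procure" joins the chain

planned_completion = sum(critical_chain.values()) + project_buffer
print(project_buffer, feeding_buffer, planned_completion)  # 20.0 4.0 60.0
```

The buffers absorb task overruns: only when a buffer is consumed beyond a threshold does the manager intervene, which is what makes the method a control framework rather than just a scheduling rule.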
The same authors have also used a resource flow network resource allocation model that protects a given baseline schedule against activity duration variability [24].

Stochastic task duration can be approached as a multi-stage decision process that uses so-called "scheduling policies" [34]. Scheduling policies have also been employed to minimize the expected project duration by developing multi-stage stochastic programming problems ([9], [10], [28]). Branch and bound methods have also been applied ([19], [34], [35]).

Al-Fawzan and Haouari [1] provided a tabu search algorithm to generate a baseline schedule for the resource constrained problem with stochastic task durations to minimize makespan while maximizing robustness. However, Kobylanski and Kuchta [22] questioned the methods in a note on their paper and provided their own definitions and methods of determining a robust schedule. They suggested fixing a makespan for the project and then determining a robust schedule for that fixed makespan. Their proposed robustness is measured by keeping the minimal free slack, or the minimal ratio of free slack to activity duration, as high as possible. Azaron, Perkgoz, et al. [3] evaluated the time-cost trade-off with a multi-objective genetic algorithm that allocates resources as the decision variable while optimizing project cost, mean completion time, and the probability that the project exceeds a time threshold.

Ballestin [4] evaluated when heuristic algorithms should be used to solve problems with stochastic task durations instead of working with the deterministic problem, and provided heuristics to develop the baseline schedule when activity durations are given by a known distribution. The first algorithm is based on regret-based biased random sampling, which uses a Schedule Generating Scheme, constructing different activity lists and a priority rule to calculate each list's probability of being selected. The second is based on Hartmann's [14] genetic algorithm and demonstrates the adaptability of this method to the stochastic RCPSP. Ballestin and Leus's [5] work continued with the development of a Greedy Randomized Adaptive Search Procedure heuristic to develop baseline schedules that minimize expected project makespan; they evaluated the heuristic by studying the distribution of possible makespans. Deblaere, Demeulemeester, et al. [7] provided three integer programming-based heuristics and one constructive procedure to develop a baseline schedule and resource allocation plan for the RCPSP with activity duration variability to maximize stability.

Other approaches include that of Long and Ohsato [26], who provided a genetic algorithm for the RCPSP with stochastic activity durations so that the optimal project duration is minimized. Shih [33] proposed a greedy method for resource allocation to reduce completion time for the RCPSP with stochastic task durations.

2.2. SRCPSP Approach: Critical Path and Buffering Methods

Grey [12] provided a detailed overview of Critical Chain Buffer Management and its application for the RCPSP, and included a discussion of critiques of the method. One concern is that the technique generates a baseline schedule by solving the deterministic RCPSP and subsequently inserts buffers for robustness, instead of solving a stochastic RCPSP [15], which may be an oversimplification of the problem [18]. Most of the research related to the use of buffers for project scheduling is in the job shop or machine scheduling literature and
includes time and resource redundancy [12]. Rabbani, Fatemi Ghomi, et al. [29] developed a new heuristic for the RCPSP with stochastic task durations by using concepts from critical chain to determine the finish time of activities at certain decision points. Additionally, Van de Vonder, Demeulemeester, et al. [44] proposed a heuristic algorithm to protect the starting times of intermediate activities when multiple activity disruptions occur by adding intermediate buffers to a minimal duration RCPSP.

Kim and De La Garza [21] proposed and evaluated the resource constrained critical path method (RCPM), which allows for the identification of alternative schedules by identifying the critical path and float data. Liu, Song, et al. [25] proposed a multi-objective model for the RCPSP and used a critical chain scheduling approach to insert feeding and project buffers into the approximate optimal project schedule to enhance stability. Grey [12] suggested the possibility of using feeding and project buffers for the Stochastic Task Insertion (STI) case as an area of future research. Tukel, et al. [38] introduced two new buffering methods for determining feeding buffer sizes in critical chain project scheduling for the SRCPSP with stochastic task durations. One method uses resource utilization and the other uses network complexity.

2.3. Reactive Scheduling for the SRCPSP

Van de Vonder, Demeulemeester, et al. [39, 42] evaluated several predictive-reactive SRCPSP procedures with the objective of maximizing stability and the probability of timely project completion. Herroelen and Leus [17] described predictive-reactive scheduling techniques as those that repair a baseline schedule to account for unexpected events, and described the various reactive strategies a project manager may take. On one hand, a reactive effort may be a very simple schedule repair such as the right shift rule [30], which moves all affected activities forward in time and does not re-sequence activities. More complex techniques suggested by the authors include a full rescheduling of the remaining portion of the schedule. This full rescheduling may be completed with a different set of objectives than those used to construct the original baseline schedule, because an additional objective may include the minimization of deviations from the baseline schedule. The authors refer to this rescheduling case with new objectives as "ex post stability" measures of performance.

References

[1] Al-Fawzan, M.A. and M. Haouari, A bi-objective model for robust resource-constrained project scheduling. International Journal of Production Economics, 2005. 96(2): p. 175.

[2] Aytug, H., M.A. Lawley, K. McKay, S. Mohan, and R. Uzsoy, Executing production schedules in the face of uncertainties: A review and some future directions. European Journal of Operational Research, 2005. 161: p. 86-110.

[3] Azaron, A., C. Perkgoz, and M. Sakawa, A genetic algorithm approach for the time-cost trade-off in PERT networks. Applied Mathematics and Computation (New York), 2005. 168(2): p. 1317-1339.

[4] Ballestín, F., When it is worthwhile to work with the stochastic RCPSP? Journal of Scheduling, 2007. 10(3): p. 153.

[5] Ballestin, F. and R. Leus, Resource-constrained project scheduling for timely project completion with stochastic activity durations. 2007.

[6] Davenport, A.J. and J.C. Beck, A survey of techniques for scheduling with uncertainty. Unpublished manuscript. Available from: http://www.eil.utoronto.ca/profiles/chris/chris.papers.html, 2002.
[15] Herroelen, W. and R. Leus, On the merits and pitfalls of critical chain scheduling. Journal of Operations Management, 2001. 19(5): p. 559.

[16] Herroelen, W. and R. Leus, The construction of stable project baseline schedules. European Journal of Operational Research, 2004a. 156(3): p. 550-565.

[17] Herroelen, W. and R. Leus, Project scheduling under uncertainty: Survey and research potentials. European Journal of Operational Research, 2005a. 165(2): p. 289.

[18] Herroelen, W., R. Leus, and E. Demeulemeester, Critical chain project

[26] Long, L.D. and A. Ohsato, Solving the resource-constrained project scheduling problem by genetic algorithm. Journal of Japan Industrial Management Association, 2007. 57(6): p. 520-529.

[27] Mehta, S.V. and R.M. Uzsoy, Predictable scheduling of a job shop subject to breakdowns. IEEE Transactions on Robotics and Automation, 1998. 14(3): p. 365-378.

[28] Pet-Edwards, J., B. Selim, R.L. Armacost, and A. Fernandez, Minimizing risk in stochastic resource-constrained project scheduling. Paper presented at the INFORMS Fall Meeting, Seattle, October 25-28, 1998.
[29] Rabbani, M., S.M.T. Fatemi Ghomi, F. Jolai, and N.S. Lahiji, A new heuristic for resource-constrained project scheduling in stochastic networks using critical chain concept. European Journal of Operational Research, 2007. 176(2): p. 794-808.

[30] Sadeh, N., S. Otsuka, and R. Schedlback, Predictive and reactive scheduling with the microboss production scheduling and control system. In: Proceedings of the IJCAI-93 Workshop on Knowledge-Based Production Planning, Scheduling and Control, pp. 293-306. 1993.

[31] Sakka, Z.I. and S.M. El-Sayegh, Float consumption impact on cost and schedule in the construction industry. Journal of Construction Engineering and Management, 2007. 133(2): p. 124-130.

[32] Selim, B.R., Robustness measures for stochastic resource constrained project scheduling. 2002, University of Central Florida: United States -- Florida.

[33] Shih, N.-H., On the project scheduling under resource constraints. Journal of the Chinese Institute of Industrial Engineers, 2006. 23(6): p. 494-500.

[34] Stork, F., Branch-and-bound algorithms for stochastic resource-constrained project scheduling, Research Report No. 702/2000. Technische Universität Berlin. 2000.

[35] Stork, F., Stochastic resource-constrained project scheduling. Ph.D. Thesis. Technische Universität Berlin. 2001.

[36] Tavares, L.V., J.A.A. Ferreira, and J.S. Coelho, On the optimal management of project risk. European Journal of Operational Research, 1998. 107(2): p. 451.

[37] Trietsch, D., Optimal feeding buffers for projects or batch supply chains by an exact generalization of the newsvendor result. International Journal of Production Research, 2006. 44(4): p. 627-637.

[38] Tukel, O.I., W.O. Rom, and S.D. Eksioglu, An investigation of buffer sizing techniques in critical chain scheduling. European Journal of Operational Research, 2006. 172(2): p. 401-416.

[39] Van de Vonder, S., E. Demeulemeester, and W. Herroelen, An Investigation of Efficient and Effective Predictive-reactive Project Scheduling Procedures. 2004: Katholieke Universiteit Leuven, Faculty of Economics and Applied Economics, Department of Applied Economics.

[40] Van de Vonder, S., E. Demeulemeester, and W. Herroelen, Heuristic Procedures for Generating Stable Project Baseline Schedules. 2005a: Katholieke Universiteit Leuven, Faculty of Economics and Applied Economics, Department of Applied Economics.

[41] Van de Vonder, S., E. Demeulemeester, and W. Herroelen, Proactive heuristic procedures for robust project scheduling: An experimental analysis. European Journal of Operational Research, 2006a. In Press, Corrected Proof.

[42] Van de Vonder, S., E. Demeulemeester, and W. Herroelen, A classification of predictive-reactive project scheduling procedures. Journal of Scheduling, 2007b. 10(3): p. 195.

[43] Van de Vonder, S., E. Demeulemeester, W. Herroelen, and R. Leus, The use of buffers in project management: The trade-off between stability and makespan. International Journal of Production Economics, 2005b. 97(2): p. 227.

[44] Van de Vonder, S., E. Demeulemeester, W. Herroelen, and R. Leus, The trade-off between stability and makespan in resource-constrained project scheduling. International Journal of Production Research, 2006b. 44(2): p. 215.

[45] Wu, S.D., R.H. Storer, and P.-C. Chang, One-Machine Rescheduling Heuristics with Efficiency and Stability as Criteria. Computers & Operations Research, 1993. 20(1): p. 1.

[46] Zhu, G., J.F. Bard, and G. Yu, A two-stage stochastic programming approach for project planning with uncertain activity durations. Journal of Scheduling, 2007. 10(3): p. 167-180.

Authors

Sandra Archer, M.S., is a Ph.D. candidate, and Robert L. Armacost, D.Sc., and Julia Pet-Armacost, Ph.D., are Associate Professors in the Department of Industrial Engineering and Management Systems at the University of Central Florida.
pieces would fly outward and possibly injure someone. After these dismissals a final design was found. A vacuum would be pulled from underneath the panel to reduce the chance of pieces flying outward. Strain gauges would be placed on the bottom surface of the panel to measure strain, and an LVDT would be mounted above to measure the deflection of the center of the panel. Using a vacuum pump, the vacuum could be slowly applied to the panel to form the static load. This final design was chosen based on safety and because it met all of the constraints. The preliminary design is shown in Figure 1.

inch tall C-section steel. The C-section steel was used to save weight over a solid bar while not greatly sacrificing strength. A 21 inch by 21 inch steel plate was used for the bottom portion of the frame and was welded to the walls of the frame.
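As a rough check of the loading scheme described above, the uniform static load produced by a partial vacuum is simply the pressure differential times the panel area. The 5 psi vacuum level below is an assumed value for illustration; only the 21 inch plate dimension comes from the rig description.

```python
# Rough sketch of the static load produced by a partial vacuum under the panel.
# The 5 psi pressure differential is an assumption for illustration; the 21 in
# side length follows the plate dimension given in the text.

panel_side_in = 21.0   # plate side length, in
vacuum_psi = 5.0       # assumed pressure differential, psi

area_in2 = panel_side_in ** 2          # loaded area, in^2
force_lbf = vacuum_psi * area_in2      # equivalent uniform static load, lbf
print(area_in2, force_lbf)             # 441.0 2205.0
```

Slowly drawing the vacuum with the pump corresponds to ramping this force gradually, which is what makes the load effectively static rather than an impact.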
Figure 3. Gussets

Three holes, two with ¾ inch diameter and a single ½ inch, would be drilled into the wall for instrumentation installation. One is used to connect the vacuum pump to the rig. Another port measures the pressure inside the vessel. These instruments can be seen in Figure 4.

Figure 4. Instrumentations

The third hole will allow the wires from the gauges to feed from the bottom of the panels to the computer. To ensure a vacuum, a rubber stopper is used to seal the hole. The strain gauges will be placed on the underside of the panel. A flange was designed to hold the test panel securely in place and to help with being air

Figure 5. Flange

The size and grade had to be increased once tests were conducted. The original bolts were number 10 fine thread, but failed once pressure was exerted onto the plate. This failure is likely due to the shear stress in the bolts exceeding the shear strength of the material. The bolt size was increased to ¼ in. Even after the increase in size the bolts still failed, so the bolt size and grade were increased to 5/16 and grade eight, respectively. Another test was conducted, during which the bolts did not fail.

4. Manufacturing

The C3x5.5 steel bar was cut into four pieces that were 20.82 inches long with a 45 degree cut at the ends. These wall sections were welded together using a double v-groove weld to ensure air tightness. The wall was welded to the plate using a fillet weld. All welds were cleaned up for appearance. The top of the walls were milled down to ensure a clean, level mating
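The bolt failures reported above are consistent with a simple average single-shear check: dividing the load by the total bolt shear area shows how quickly the stress drops as diameter grows. All values below (load, bolt count, the approximate 0.19 in diameter of a #10 bolt) are assumptions for illustration, not measurements from the rig.

```python
import math

# Rough single-shear check of the flange bolts. Load, bolt count, and the
# ~0.19 in #10 diameter are assumed for illustration; 1/4 in and 5/16 in
# are the upgraded sizes mentioned in the text.

def shear_stress_psi(load_lbf, n_bolts, dia_in):
    """Average shear stress when n bolts share the load in single shear."""
    area = math.pi * (dia_in / 2.0) ** 2   # shear area of one bolt, in^2
    return load_lbf / (n_bolts * area)

load = 2205.0   # assumed total load on the flange, lbf
n = 8           # assumed number of bolts

for dia in (0.19, 0.25, 0.3125):
    tau = shear_stress_psi(load, n, dia)
    print(f"dia {dia:.4f} in -> {tau:,.0f} psi average shear stress")
```

Moving from a #10 to a 5/16 in bolt cuts the average shear stress to well under half, and switching to grade eight raises the allowable shear strength as well, which is consistent with the failures stopping only after both size and grade were increased.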
LuAnn Bean
Florida Institute of Technology
Lbean@fit.edu
Jeff Manzi
Kaplan Financial Education
Jeff.Manzi@kaplan.com
it [7,13]. Therefore, a key part of this step is to schedule sufficient time before convening a session so that all participants can complete brainstorming "homework." This should include a review of the most current financial information, a review of prior year or interim financial statements, and a review of previous management letters and other documents. Many experts agree that this methodical but somewhat slow start may lead to long-run time savings because fewer wrong ideas will be generated and greater problem-solving effectiveness results.

2. Use direct analogies to generate ideas. Analogies can lead auditors to notice parallels or interconnected fraud or material misstatement situations—those that might be occurring in the current client's organization but are based on knowledge of similar incidences in previous clients or cases. An important checklist to incorporate during this step of the model is contained in Appendix C of SAS No. 109. Facilitators can utilize the factors listed in this standard to stimulate discussion about similar situations in the current audit, since this list was collectively developed as a summary of business risks found in past audit failures or red flags noted by the ASB. However, researchers caution that this approach might lead to a functional fixedness of the mental set at this point in the process. In other words, more knowledgeable auditors (or experts) may focus too closely on similar audits and as a result generate fewer ideas than less experienced auditors about potential misstatements or fraud situations, due to their biases, preconceptions, and overconfidence [11,12].

3. React to the analogies. How is this client's company similar or dissimilar to analogized cases of fraud or material misstatement? Here, research indicates that experts (or more knowledgeable auditors) will surpass novices (or less experienced auditors) due to their abilities to recognize patterns [10]. Therefore, facilitators should draw on the knowledge of auditors with the most extensive experience and memory of client history for these answers during this step of the model.

4. Explore conflicts and uncouple oxymorons. This step promotes the dissection of new ideas, which may at face value seem ridiculous or contradictory. Discussion should be focused on questions about what client-specific controls, personnel, environmental, or risk-centric situations would or would not allow generated scenarios to exist. In the process, the audit team should address significant areas of audit risk, areas subject to management's override of controls, unusual accounting procedures used by the client, and important control systems.

5. Develop any new analogies. Through Step 4 exploration, new analogies may surface. During this step, explore what additional factors, risks, or red flags of fraud were identified that made previous ideas feasible, implausible, or partially possible.

6. Reexamine the original client situation. Both SAS No. 99 and SAS No. 109 suggest that resulting ideas should help determine the extent of testing for fraud and the establishment of materiality at both the financial statement and account levels. As a result, final financial statement scenarios that identified overall audit risks may require procedures and actions performed by a more experienced audit team, while assertion-based risks at the account level may target specific, directed procedures (like the analysis of product demand and inventory turnover used to explore inventory obsolescence).
in the Psychology of Human Intelligence (Vol. 1, pp. 7-76), Hillsdale, N.J.: Erlbaum, 1982.

[8] W. J. J. Gordon, Synectics, New York: Harper & Row, 1961.

[9] A. A. Hayes, "Sneaking a Peek: How the New Auditing Standards on Risk Assessment May Affect Fraud Huddles," The Journal of Government Financial Management, 2006, 55 (3), pp. 68-70.

[10] A. Lesgold, H. Rubinson, P. Feltovich, R. Glaser, D. Klopfer, and Y. Wang, "Expertise in Complex Skill: Diagnosing X-Ray Pictures," In M.T.H. Chi, R. Glaser, M.J. Farr (Eds.), The Nature of Expertise (pp. 311-342), Hillsdale, N.J.: Erlbaum, 1988.

[11] R. J. Sternberg, "Costs of Expertise," In K. A. Ericsson (Ed.), The Road To Excellence: The Acquisition of Expert Performance in the Arts and Sciences, Sports, and Games (pp. 347-354), Hillsdale, N.J.: Erlbaum, 1996.

[12] R. J. Sternberg and P. A. Frensch, "On Being an Expert: A Cost-Benefit Analysis," In R. R. Hoffman (Ed.), The Psychology of Expertise: Cognitive Research and Empirical AI (pp. 191-203). New York: Springer Verlag, 1992.

[13] J. F. Voss, T. R. Greene, T. Post, and B. C. Penner, "Problem Solving Skill in the Social Sciences," In G. Bower (Ed.), The Psychology of Learning and Motivation, pp. 165-213. New York: Academic Press, 1983.

[14] R. E. Wortmann and A. Henry, "Don't Get Stuck Behind the Eight Ball: Master New Auditing Standards Now," Pennsylvania CPA Journal, 2007, 77 (4), pp. 28-35.

Authors

LuAnn Bean, Ph.D., CPA, CIA, CFE, FCPA, is Professor of Accounting at Florida Institute of Technology in Melbourne, Florida.

Jeffrey Manzi, Ph.D., CFA, is Vice President of Securities and Insurance for Kaplan Financial Education in LaCrosse, Wisconsin.