
Simulation-Based Bucket Elevator Monitoring System

Basco, John Benedict B.

Gomez, Jude Arthur King D.

Guerra, Denise Mia B.

Par, Joshua P.

Pelobillo, Alyanna Joyce M.

Technological Institute of the Philippines

Manila

December 2020
Acceptance Sheet

The proposed design project entitled “Simulation-Based Bucket Elevator Monitoring System” has been
prepared and submitted by the proponents:

Basco, John Benedict B.

Gomez, Jude Arthur King D.

Guerra, Denise Mia B.

Par, Joshua P.

Pelobillo, Alyanna Joyce M.

After thorough reviews and evaluations, the committee has accepted the proposed design project based on
the required criteria. All information presented during the proposal defense is valid. Accepted this January
27, 2021, 1st Semester, S.Y. 2020-2021.

_____________________ ___________________________ _________________________


Engr. Christine Joy Arce Engr. Marjorie B. Villanueva Engr. Cayetano D. Hiwatig

Panel Member Panel Member Panel Member

_______________________

Engr. Gerhard Tan

Project Adviser

_______________________

Engr. Roberto C. Dela Cruz

ECE Program Chair

_______________________

Engr. Marianne L. Yumul

Dean, College of Engineering and Architecture


Approval Sheet

The proposed design project entitled “Simulation-Based Bucket Elevator Monitoring System” has been
prepared and submitted by the proponents:

Basco, John Benedict B.

Gomez, Jude Arthur King D.

Guerra, Denise Mia B.

Par, Joshua P.

Pelobillo, Alyanna Joyce M.

was examined and evaluated by the members of the Design Project Evaluation Committee, and is now
recommended for approval.

_____________________ ___________________________ _________________________


Engr. Christine Joy Arce Engr. Marjorie B. Villanueva Engr. Cayetano D. Hiwatig

Panel Member Panel Member Panel Member

_______________________

Engr. Gerhard Tan

Project Adviser

_______________________

Engr. Roberto C. Dela Cruz

ECE Program Chair

_______________________

Engr. Marianne L. Yumul

Dean, College of Engineering and Architecture


Table of Contents
CHAPTER 1 – Project Background..................................................................................................................5
The Project....................................................................................................................................................5
Project Objectives.........................................................................................................................................8
The Client......................................................................................................................................................8
Project Scope................................................................................................................................................8
Project Limitations.........................................................................................................................................8
Project Development.....................................................................................................................................9
CHAPTER 2 – Design Inputs..........................................................................................................................11
Client Requirements....................................................................................................................................11
Design Criteria and Constraints..................................................................................................................11
Response Time.......................................................................................................................................12
Reliability.................................................................................................................................................12
Projected Cost.........................................................................................................................................12
Comparison of Related Literature...............................................................................................................13
International Papers................................................................................................................................13
Local Papers...........................................................................................................................................15
Design Specifications..................................................................................................................................17
CHAPTER 3 – Project Design........................................................................................................................18
DESIGN OPTION 1 Using Infrared Tachometer and Infrared Sensor.......................................................19
Efficiency.................................................................................................................................................28
Performance............................................................................................................................................28
Economic................................................................................................................................................28
Critical Path Method................................................................................................................................30
DESIGN OPTION 2 Using Embedded Tachometer and Ultrasonic Sensor..............................................33
Efficiency.................................................................................................................................................42
Performance............................................................................................................................................42
Economic................................................................................................................................................43
Critical Path Method................................................................................................................................44
DESIGN OPTION 3 Using Image Processing............................................................................................47
Efficiency.................................................................................................................................................52
Performance............................................................................................................................................52
Economic................................................................................................................................................53
Critical Path Method................................................................................................................................54
CHAPTER 4 – Design Constraints, Trade-offs, and Standards.....................................................................57
Constraints..................................................................................................................................................57
Trade-offs....................................................................................................................................................58
Sensitivity Analysis......................................................................................................................................58
Pareto Optimization.....................................................................................................................................67
Efficiency and Performance....................................................................................................................67
Performance and Economic Criteria.......................................................................................................68
Economic Scale and Efficiency Criteria..................................................................................................71
Chapter 5 – Final Design................................................................................................................................77
Winning Design...........................................................................................................................................78
Components................................................................................................................................................79
Testing.........................................................................................................................................................85
Methods of Procedure.................................................................................................................................85
Simulation Process.....................................................................................................................................85
Computation for the Performance of the device based on Efficiency:.......................................................87
Interpretation of Data..................................................................................................................................87
Summary of Findings..................................................................................................................................87
Conclusion..................................................................................................................................................87
Recommendation........................................................................................................................................88
References..................................................................................................................................................89
APPENDICES.................................................................................................................................................91
Appendix A. Images of Simulated Bucket Elevator in SOLIDWORKS (Scale used in Design Option 3).......91
Appendix B. Algorithm Testing...................................................................................................................92
Appendix C. Sensitivity Analysis.................................................................................................................99
Appendix D. Datasheet.............................................................................................................................124

List of Figures
Figure 1-1 Bucket grain elevator used by SIDC.............................................................................................10
Figure 1-2 Simulated Design of SIDC’s Bucket Elevator...............................................................................11
Figure 1-3 Feed bags produced by SIDC from 2015-2019............................................................................11
Figure 1-4 System Architecture for Project Development..............................................................................13
Figure 2-1 National Instruments LabVIEW.....................................................................................................21
Figure 2-2 Dassault Systèmes SOLIDWORKS..............................................................................................21
Figure 3-1 Tree Diagram for Design Options.................................................................................................22
Figure 3-2 NI LabVIEW Front Panel for Design Option 1................................................................23
Figure 3-3 Design Option 1 Simulation Process.............................................................................................24
Figure 3-4 NI LabVIEW Block Diagram for Design Option 1...........................................................26
Figure 3-5 Power On and OFF logging...........................................................................................................26
Figure 3-6 Operation Time Computation (runtime).........................................................................................26
Figure 3-7 Date and Time Grabber.................................................................................................................27
Figure 3-8 Error Odds.....................................................................................................................................27
Figure 3-9 Value Handlers..............................................................................................................................28
Figure 3-10 Speedometer...............................................................................................................................28
Figure 3-12 Default Speed of the Bucket Elevator.........................................................................................29
Figure 3-13 CSV File Maker...........................................................................................................................29
Figure 3-14 Volumetric Throughput................................................................................................................30
Figure 3-15 Front Panel Essential Operators.................................................................................................30
Figure 3-16 Infrared sensors...........................................................................................................................31
Figure 3-17 Excel file of Alert Log...................................................................................................................31
Figure 3-18 Network Activity for Design Option 1...........................................................................................34
Figure 3-19 Critical Path Network Diagram for Design Option 1’s entire schedule of activities....................35
Figure 3-20 NI LabVIEW Front Panel for Design Option 2..............................................................37
Figure 3-21 Design Option 2 Simulation Process...........................................................................................38
Figure 3-22 NI LabVIEW Block Diagram for Design Option 2.........................................................40
Figure 3-23 Power On and Off Logging..........................................................................................................40
Figure 3-24 Operation Time Computation......................................................................................................40
Figure 3-25 Date and Time Grabber...............................................................................................................41
Figure 3-26 Error Odds...................................................................................................................................41
Figure 3-27 Value Handlers............................................................................................................................42
Figure 3-28 Speedometer...............................................................................................................................43
Figure 3-29 Critical Error Inducer....................................................................................................................43
Figure 3-30 Default Values.............................................................................................................................43
Figure 3-31 CSV File Maker...........................................................................................................................44
Figure 3-32 Volumetric Throughput Computation..........................................................................................45
Figure 3-33 Front Panel Logic........................................................................................................................45
Figure 3-34 Symphony of Embedded Tachometer and Ultrasonic Sensors..................................................46
Figure 3-35  Excel file of Alert Log..................................................................................................................47
Figure 3-36 Network Activity for Design Option 2...........................................................................................49
Figure 3-37 Critical Path Network Diagram for Design Option 2’s entire schedule of activities....................50
Figure 3-38 NI LabVIEW Front Panel for Design Option 3..............................................................52
Figure 3-39 Design Option 3 Simulation Process...........................................................................................53
Figure 3-40 NI LabVIEW Block Diagram for Design Option 3.........................................................53
Figure 3-41 CSV File Maker...........................................................................................................................54
Figure 3-42 Volumetric Throughput Computation..........................................................................................54
Figure 3-43 Value Handlers............................................................................................................................55
Figure 3-44 Default and Basis........................................................................................................................55
Figure 3-45 Date and Time Grabber...............................................................................................................55
Figure 3-47 Belt Speed Image Processing.....................................................................................................56
Figure 3-48 Boot Operations...........................................................................................................................56
Figure 3-49 Network Activity for Design Option 3...........................................................................................59
Figure 3-50 Critical Path Network Diagram for Design Option 3’s entire schedule of activities....................60
Figure 4-1 Efficiency versus Performance Design Criterion Space...............................................................73
Figure 4-2 Performance versus Economic Design Criterion Space...............................................................74
Figure 4-3 Performance and Economic Scale Criteria Decision Space.........................................................75
Figure 4-4 Efficiency and Economic Criteria Decision Space........................................................................77
Figure 5-1 Design Option 3 using Image Processing Front Panel as a Winning Design Option...................83
Figure 5-2 Design Option 3 using Image Processing Block Diagram............................................................84
Figure 5-3 Single Mode Power Supply 24V, 12V, 5V (150W)........................................................................84
Figure 5-4 Coaxial Cable................................................................................................................................84
Figure 5-5 Resistor..........................................................................................................................................85
Figure 5-6 Capacitor.......................................................................................................................................85
Figure 5-7 Crystal Unit....................................................................................................................................86
Figure 5-8 Microcontroller Unit (Raw Chip)....................................................................................................86
Figure 5-9 Transistor-Transistor Logic Converter...........................................................................................87
Figure 5-10 CCTV or Webcam.......................................................................................................................87
Figure 5-11 High Torque Motor with Gearbox Assembly...............................................................................88
Figure 5-12 Motor Driver.................................................................................................................................88
Figure 5-13 Aluminum Sheet (5mmx1000mmx1000mm)...............................................................................89
Figure 5-14 Aluminum Sheet (3mmx1000mmx1000mm)...............................................................................89
Figure 5-15 5m USB A to B Cable..................................................................................................................90
Figure Appendix-1 Simulated Bucket Elevator...............................................................................................96
List of Tables

Table 2-1 Design Criteria and Constraints......................................................................................................17


Table 2-2. Related Research Studies.............................................................................................................19
Table 3-1 Response Time of Design Option 1...............................................................................................34
Table 3-2 Reliability of Design Option 1.........................................................................................................34
Table 3-3 Economic Feasibility of Design Option 1........................................................................................34
Table 3-4 Design Option 1 Project Activities..................................................................................................36
Table 3-5 Critical Paths of Design Option 1....................................................................................................37
Table 3-6 Total Time Duration of Design Option 1.........................................................................................38
Table 3-7 Response Time of Design Option 2...............................................................................................49
Table 3-8 Reliability of Design Option 2.........................................................................................................49
Table 3-9 Economic Feasibility of Design Option 2........................................................................................49
Table 3-10 Design Option 2 Project Activities................................................................................................51
Table 3-11 Critical Paths of Design Option 2.................................................................................................52
Table 3-12 Total Time Duration of Design Option 2.......................................................................................53
Table 3-13 Response Time of Design Option 3.............................................................................................59
Table 3-14 Reliability of Design Option 3.......................................................................................................59
Table 3-15 Economic Feasibility of Design Option 3......................................................................................60
Table 3-16 Design Option 3 Project Activities................................................................................................61
Table 3-17 Critical Paths of Design Option 3.................................................................................................63
Table 3-18 Total Time Duration of Design Option 3.......................................................................................63
Table 4-1 Design Criteria using Sensitivity Analysis......................................................................................65
Table 4-2 Design Criteria using Sensitivity Analysis with Efficiency criterion having the highest level of
importance value.............................................................................................................................................66
Table 4-3 Design Criteria using Sensitivity Analysis with Performance criterion having the highest level of
importance value.............................................................................................................................................67
Table 4-4 Design Criteria using Sensitivity Analysis with Economic Scale criterion with the highest level of
importance value.............................................................................................................................................67
Table 4-5 Design Criteria using Sensitivity Analysis with all criteria having the highest level of importance
value................................................................................................................................................................68
Table 4-6 Design Criteria using Sensitivity Analysis with Efficiency criterion having with the highest level of
importance value.............................................................................................................................................68
Table 4-7 Design Criteria using Sensitivity Analysis with Performance criterion with the highest level of
importance value.............................................................................................................................................69
Table 4-8 Design Criteria using Sensitivity Analysis with Economic Scale criterion having the highest level
of importance value.........................................................................................................................................70
Table 4-9 Design Criteria using Sensitivity Analysis with all criteria having the highest level of importance
value................................................................................................................................................................70
Table 4-10 Design Criteria using Sensitivity Analysis with Efficiency criterion with the highest level of
importance value.............................................................................................................................................71
Table 4-11 Design Criteria using Sensitivity Analysis with Performance criterion with highest level of
importance value.............................................................................................................................................72
Table 4-12 Design Criteria using Sensitivity Analysis with Economic Scale criterion with highest level of
importance value.............................................................................................................................................72
Table 4-13 Summarized data for Efficiency and Performance.......................................................................74
Table 4-14 Summarized Data for Performance and Economic......................................................................75
Table 4-16 Coordinates of the Economic Scale and Performance................................................................77
Table 4-17 Summarized Data for Economic Scale and Efficiency.................................................................78
Table 4-18 Coordinates of the Economic Scale and Efficiency......................................................................80
Table 4-19 Summary of Pareto Optimization of the Design Option...............................................................81
Table 5-1 Performance of the Device.............................................................................................................93
CHAPTER 1 – Project Background

The Project

Next to rice, corn (Zea mays) is the most important crop in the Philippines. Some rural areas in the
country use white corn as a substitute for rice during shortages, while 90% of the yellow corn produced
goes to livestock and poultry feed.
 
In a 2019 report shared by the United States Department of Agriculture, corn production was projected to
increase to 8.1 MMT in the 2019-2020 market year. Because of the growing market for feeds, the USDA
expects demand for feed corn to rise to 6.7 MMT as the Filipino diet shifts away from consuming corn
directly.
 
In an article published by the Philippine Council for Agriculture, Aquatic, and Natural Resources Research
and Development, yellow corn is described as an essential part of feed milling and animal nutrition,
accounting for 50-60% of the total mixed feed ration. Furthermore, yellow corn is a primary supplement
source and can contribute up to 30% of the protein, 60% of the energy, and 90% of the starch in an
animal's diet (Dado, 1999).
 
Various businesses have ventured into livestock feed manufacturing to meet the growing demand for corn
feeds. The most recent data show 545 registered feed mills in the Philippines, 94% of which are small
manufacturers producing 50 metric tons of feed per day.
 
As explained by Plant Engineer Reymond T. Gonzales, the Sorosoro Ibaba Development Cooperative (SIDC)
Feed Mill Division receives corn grains from its partners and begins feed production by separating the
grains and storing them in a silo with a capacity of 7,000 metric tons. Before the animal feeds are
marketed, a bucket elevator transfers the corn grains from the silo to the processing plant to be readied
for delivery.
 
According to SIDC's Engr. Trini Friend Ramos, their feed mill runs three eight-hour shifts from Monday to
Saturday, while Sunday is reserved for machinery maintenance. Each shift produces an average of 3,000
bags, or 150,000 kilos, of animal feeds. The bags distributed weigh 50 kilos each and are worth 1,000
pesos when sold to their distributors.
 
Their bucket elevator transports 60 tons of corn grains per hour, yielding about 400 bags of feed per
hour. SIDC's daily quota is 9,000 bags, but Engr. Ramos reported that a frequent malfunction they
encounter is the clogging of the bucket elevator. Clogging occurs when corn grains fall off the buckets
into the boot of the elevator.
 
Because clogging occurs once or twice every other day, the feed mill operators devote two hours to
unclogging the machine before resuming operation. SIDC thus fails to meet its quota, producing as few as
5,000 bags per day depending on how frequently the machine malfunctions. The remaining target for the day
is carried over to the next day's quota, or the operators work the mill on Sunday.
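The production shortfall from a clogging stop can be estimated directly from the figures above. The
following is a hypothetical back-of-the-envelope sketch, not an SIDC calculation; the constants are taken
from the chapter (50-kilo bags sold at 1,000 pesos each, roughly 400 bags produced per hour):

```python
# Illustrative model of production lost to bucket-elevator downtime.
# Constants come from the figures quoted in this chapter; the model
# itself is an assumption for estimation purposes only.
BAG_PRICE_PHP = 1000     # selling price per 50-kilo bag
BAGS_PER_HOUR = 400      # approximate feed output per hour of operation

def bags_lost(downtime_hours, bags_per_hour=BAGS_PER_HOUR):
    """Bags of feed not produced while the line is stopped."""
    return downtime_hours * bags_per_hour

def revenue_lost(downtime_hours, price=BAG_PRICE_PHP):
    """Peso value of the production shortfall for a given downtime."""
    return bags_lost(downtime_hours) * price

# A single two-hour unclogging stop:
shortfall_bags = bags_lost(2)        # 800 bags
shortfall_php = revenue_lost(2)      # 800,000 pesos
```

Under these assumptions, each two-hour unclogging stop forgoes roughly 800 bags, or 800,000 pesos of
feed, which illustrates why reducing downtime is the focus of this project.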
Figure 1-1 Bucket grain elevator used by SIDC

Figure 1-2 Simulated Design of SIDC’s Bucket Elevator

The clogging at the boot causes significant problems for the bucket elevator. The motor forces the
33-meter conveyor belt to slip, and as errors accumulate, production stops for over ten hours while SIDC
disassembles the machine and reconnects the conveyor belt. Such a pause in production costs around 2.3
million pesos worth of feeds.

Figure 1-3 Feed bags produced by SIDC from 2015-2019

In 2015, SIDC produced 2.4 million bags of corn feeds, amounting to 2.4 billion pesos in sales, and as
the company has grown, its feed mills reached 3.2 million bags by the end of 2019. However, SIDC has yet
to find a solution that reduces the bucket elevator's downtime when it clogs from excess falling corn
grains. As the business grows, the elevator's downtime is becoming a significant problem for the company.

A bucket elevator is vital to feed mill production since it controls the transfer of corn grains. There
is no accurate method of measuring the excess corn grains falling into the bucket elevator's boot; losses
are therefore estimated from the hours allotted for repair. As the number of bags produced increases, the
amount of excess corn grain also increases because the machine has seen no significant improvement.
Eventually, the unprocessed corn grains will also hurt the finances of smaller manufacturers.

With the machine operating for long periods, the falling corn accumulates in the boot and can clog and stop the entire bucket elevator. As a result, the operators temporarily stop the operation to remove the build-up of corn. To tackle this problem, the proponents designed various systems to determine the bucket speed and the corn build-up inside the elevator. With these designs, the machine operators can monitor the status of the bucket elevator and remove the accumulated corn before it clogs the boot and forces the operation to stop.

These design proposals are essential for monitoring and detecting changes inside the bucket elevator, mainly the belt speed and the corn grain level at the boot, to further reduce company losses. Specifically, when the grains at the boot reach a critical level, the design will automatically unclog the boot and notify the operators about the status inside with the aid of an LED light and an Excel log file recording the status and the kind of incident at the boot.

Project Objectives

This project aims to create a systems design simulation using LabView software. The design simulation
shall monitor and detect speed changes from the bucket elevator. The general objective of the project is to
design a reliable Bucket Elevator Monitoring System. Specifically, the project aims to:

 Design a simulation-based monitoring system.
 Detect and monitor slippage and clogging of the corn mill bucket elevator.
 Provide indicators that will alert the machine operators of reduced speed.

The Client

The client of the Bucket Elevator Maintenance Monitoring and Alarm System is the Sorosoro Ibaba Development Cooperative (SIDC). Since the proponents’ objective is to help smaller manufacturers improve their feed production, the group chose SIDC after researching for an agricultural company that seeks to meet the growing standards of feed production by mitigating the losses incurred by the machines it has bought.

Furthermore, SIDC’s ISO certification is proof that it aims to be a world-class manufacturer in the Philippines through its manufacturing processes. With the growing business and its interest in expanding, SIDC agreed to have the group help solve these issues.

Project Scope

The project detects changes in the bucket elevator and monitors the accumulation of corn grains falling at the boot. The system will notify the operator of the status of the bucket elevator, and the device will programmatically unclog the boot before the grains reach a dangerous level, as shown through the system indicators.

The project design will only detect the bucket elevator belt speed and the excess corn grains at the boot, and alert the operators. The design implies that all manual maintenance checks inside the corn mill bucket elevator, as well as the unclogging itself, still rely on the plant operators and engineers.

Project Limitations

For confidentiality reasons, SIDC did not reveal their exact production data, and the proponents only
obtained data through interviews with SIDC's engineers. Moreover, SIDC stated that there should be no
significant construction changes made in the bucket elevator with the proposed simulation design. The
simulation is explicitly made for SIDC's bucket elevator and will only focus on it. 
Moreover, the excess corn falling off the bucket elevator cannot be measured precisely, so SIDC explained
that they measure the losses in terms of the time it took to resolve the clogging and slipping. 

Due to the bucket elevator's technological constraints and depending on the machine's supplier, the
simulation is only a monitoring system. The unclogging system is excluded in the design but considered in
the program for further developments. SIDC cannot provide images and videos needed for the design
options due to structural and confidentiality issues. 

Project Development

Figure 1-4 System Architecture for Project Development

Problem Definition

The first part of the project aims to study and determine the client’s issues that need to be solved, and to establish whether a simulation is an appropriate tool for addressing them.

Data Gathering

The project aims to research processes in related fields such as Electronics and Mechanical Engineering to help the proponents better understand the problem. The group will also assess the various parameters needed in brainstorming possible solutions. These parameters include, but are not limited to, the operation of a bucket elevator, the process of frequency detection, and other sensors that can aid the design. By gathering the required data, the proponents can quickly identify which of these project parameters need the utmost priority.

Conceptual Planning

Once the proponents have discussed all possible solutions, the group will trim down the options into the
most viable ones to study and analyze better approaches to improve the initial solution. These ideas will
supplement the researched data in formulating the best approach to this project. With the formulated
approaches, the proponents can form the necessary steps needed to meet their objectives.
System Definition

The system is often very complex; thus, identifying the system is an important step. It involves defining the
system components to be modelled and the performance measures to be analyzed.  

Model Formulation

The solutions will be divided based on the specifications that will fit the objectives of the research. The
group will then look into the project’s technical aspect to create a model that can accomplish their set
goals. 

Simulation

After highlighting various solutions, the proponents will apply their knowledge to build a virtual simulation on
existing designs that have a similar objective to their study. In meeting the client requirements, the proponents will use the design option that yields the most accurate results. By doing so, the proponents will secure an efficient, accurate, and reliable design whose quality is improved through various tests. This part is also where the proponents will conduct tests to help them meet their client’s set standards.

Design Improvement

Lastly, the proponents will make further adjustments to meet the client’s requirements and improve the
design’s accuracy and effectiveness. For further improvements, the system should incorporate various
operationally realistic aspects before being finalized for operational testing.
CHAPTER 2 – Design Inputs

In a feed mill, the transport of corn grains from the processing plant to the storage silo needs to be efficient.
During the transfer of the processed grains, the elevator’s vertical motion shakes some processed grains
off the buckets and causes clogging. The excessive accumulation of corn at the boot causes the elevator
belt to slip off, forcing the operators to pause production to disassemble and unclog the elevator. There is
no way for the operators to foresee the clogging until the bucket elevator stops on its own. Constant and
accurate monitoring is required to reduce the frequency of these incidents. Furthermore, sensors attached
to the elevator are vital in accomplishing this task as it will detect changes in the bucket elevator’s operation
and excess corn accumulating in the base. After the system has detected unaccounted changes in the
boot’s amount and the corn grains’ flow rate, the design will automatically unclog at the boot to avoid
disrupting the operation that might cause more problems. 

Client Requirements

 The project should detect and monitor the corn mill bucket elevator’s clogging and slippage and
send an alarm to the machine operators and engineers.
 Process Time: The system’s response from the time the sensors detect trouble should take no
more than 3 seconds.
 Cost-effectiveness of the device: The device should not exceed the amount of ₱ 50,000.
 The life expectancy is five years. 

Design Criteria and Constraints 

Design Criteria    Design Constraint    Value

Efficiency         Response Time        Time ≤ 3 seconds

Performance        Reliability          0 ≤ AFR < 65

Economic           Projected Cost       Cost ≤ ₱ 50,000

Table 2-1 Design Criteria and Constraints


Response Time

This design constraint focuses on the consistency of the maximum processing time: a shorter processing time is better for the design simulation, since it notifies the operator faster of any irregularities happening inside the bucket elevator. Once the boot reaches a high level of corn grains, the system should immediately start unclogging to avoid disrupting the bucket elevator's operation. However, the time it takes the system to unclog is not considered and is out of scope.

Response Time = Time Processed − Time Detected
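As an illustration, the response-time constraint can be checked against two logged timestamps. The timestamp values below are hypothetical and only demonstrate the computation; they are not client data.

```python
from datetime import datetime

# Hypothetical log timestamps (illustrative values only)
time_detected = datetime(2021, 1, 27, 10, 15, 30)   # sensors detect trouble
time_processed = datetime(2021, 1, 27, 10, 15, 32)  # system finishes responding

response_time = (time_processed - time_detected).total_seconds()
meets_requirement = response_time <= 3.0  # client requirement: at most 3 seconds
```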

Reliability

The reliability constraint assures that the output of the device is consistent and the device is dependable at any time. The device's output should be consistent, meaning it generates the same overall system output when performing multiple iterations. A high level of component reliability is essential to keep idle times to a minimum. The reliability of a component or a whole system is measured by the mean time between failures (MTBF) (High Availability and Disaster Recovery, 2006). However, the Annualized Failure Rate (AFR) gives more transparency as to how reliable the system is, providing a better view of the chance of a failure in any given year. AFR is computed from the mean time between failures (MTBF) itself, as shown below:

Failures in Time (FIT) = Failed Attempts / Total Operating Time

MTBF = 1 / FIT

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

Projected Cost

The estimated costs of the project’s materials and components from actual prices in the market are
considered on the economic criteria. The overall circuitry’s total costs will be recorded and compared to
each specific alternative design to determine what design alternative would be the most economically
feasible among the three (3) designs. The cost production of the device should fit the budget set by the
client. Hence, the cost should not exceed the amount of PHP 50,000.00.
Comparison of Related Literature

Table 2-2. Related Research Studies

Year of Publication    Title    Similarities    Differences

International Papers

2019 Case Study – Safety Measures The study highlights Sensors were installed
Applied on Bucket Elevators  monitoring the in the bucket elevator
bucket elevator using itself and around it.
Nikola Ilanković, Dragan Živanić, Atila sensors.
Zelić, Srđan Savić The control system is
mainly infeed. Also,
this paper detects belt
slipping.

2019 Analysis of Belt Bucket Elevator  The paper This paper focuses
discusses the on what specific belt
Mg Than Zaw Oo, Ma Myat Win essential design, shaft &
Khaing, Ma Yi Yi Khin parameters used in pulley, is best used in
corn bucket a feed mill.
elevators.

2016 Material Optimization And Modal The research utilizes The research
Analysis Of Elevator Bucket the vibration from the emphasizes the
bucket elevator to construction of
Sunil Tukaram Shinde, Vidyadhar obtain various efficient
Sudarshan Dixit, Masnaji Rajaram, natural elevator buckets.
Shailesh S. Pimpale frequencies when it
carries different
materials.

2015 Design And Model Of Bucket of The Study Considers This paper focuses on
Bucket Elevator the motion of the a fabricated version of
bucket elevators at bucket elevators for
Amit Jaiswal various heights. lifting wheat or any
bulk material.

2013 Failure Analysis of Belt Conveyor The paper discusses This paper determines
Damage Caused By The Falling the effects of the conditions of the
Material materials falling in a conveyor belt
conveyor belt depending on the
Gabriel Fedorko, Vieroslav The reduction of weight of the sharp
Molnar, Daniela Marasova, Anna conveyor belt
Grincova, Miroslav Dovica, Jozef damage from falling
Zivcak, Teodor Toth, Nikoleta materials
Husakova

2012 A Review On Design And Analysis Of The study features This paper deals with
Bucket Elevator the The design of the
adverse effects of bucket elevator parts
Snehal Patel, Sumant Patel, Jigar concentrated stress
Patel on a conveyor belt.

2010 Intelligent Belt Conveyor Monitoring This book discusses Requires significant
and Control the automation of a reconstruction on the
monitoring system in belts for implementing
conveyor belts. its automation
Yuson Pang
It also presents
various possibilities
to improve belt
performance

2009 Failure analysis of belt conveyor This paper The study is limited to
systems emphasizes condition monitoring of
condition monitoring horizontal conveyor
Radoslaw Zimroz, Robert Król on conveyor belt belts only
systems.

Consider the scale


and frequency of
error in a belt
system.

2007 Hazard Monitoring Equipment Explains the The manual uses only
malfunctions and Force Switches, Rub-
Selection, Installation, and errors of a typical blocks, Limit Switches,
Maintenance bucket elevator. and Magnetic sensors
The manual to control the bucket
Johnny Wheat discusses the elevator’s operation.
hazards brought by
these errors.

Local Papers

2017 Design and Implementation of an The study accounts It requires various


Automated Fish Feeder  Robot  for  for the weight of motors to synchronize
the  Philippine Aquaculture materials carried by and function
the conveyor belt for completely
Industry: Feeding Mechanism and distribution.
Float Design Module

Maria Cizel U. Deroy, Aeus Joshua


Espaldon, Johanna Ericka F. Osa,
Gian Paulo B. Macalla and Diogenes
Armando D. Pascua

2012 Obstacle Detector It is used for the Obstacle Detector V.2


detection of focuses on detecting
Henry A. Barramoda, Kenny Lois H. obstacles or the obstacle or
Sabillo hindrances in a hindrances for the
specific area. blind and visually
impaired, while the
The paper uses an proponents’ focuses
IR sensor as an on detecting the
essential component obstacle for a corn mill
of the device. bucket elevator.

2010 Bucket Elevator Manual The manual gives It has no


the specifications of troubleshooting guide
Intersystems the bucket elevators for each type for each
used by the client. bucket elevator.

Explains the various


types of bucket
elevators.

2008 Bucket Elevator and Conveyors Compares the pulley The paper compares
sizes of various the weight capacity of
GSI Group bucket elevators buckets.

2006 Design of Security Alarm System To further notify the focuses on detecting
Using SMS Technology device’s break-ins and
Operators/Owners if executing alarms in
Michael A. Cardenas, Ronilo A. there has been a Business and
Dimzon, Glenn B. Empeno, Huebert complication in its residential properties,
D. Gutierrez, Harvey G. Porcel-Lee,  surroundings. while the proponents’
Rheyan N. Santos, Sheryll V. paper focuses on
Savellano, Laurence B. Sojuaco executing alarms
inside a feed mill plant
if there has been any
blockage or leakage
inside the bucket
elevator.

Design Specifications

The proponents considered maximizing the use of the following platforms as a means of simulation: 
Figure 2-1 National Instruments LabVIEW
 
Laboratory Virtual Instrument Engineering Workbench (LabVIEW) is an interactive program and execution system composed of configurable virtual instruments designed to accomplish tasks more easily than conventional programming. Using LabVIEW, the user can create precisely the type of virtual instrument needed, and programmers can easily view and modify data or control inputs.

Figure 2-2 Dassault Systèmes SOLIDWORKS

SolidWorks is a solid-modelling computer-aided design (CAD) and computer-aided engineering (CAE) program that runs primarily on Microsoft Windows. It is a powerful 3D CAD software and product design tool for modern-day 3D product designers and engineers.

CHAPTER 3 – Project Design 


In this chapter, the proponents tackle in depth all three proposed designs with respect to the conducted study. Every approved design option is accompanied by a visual representation of its circuitry and a simulation that depicts the study's overall performance and functionality. Furthermore, this chapter elaborates on all of the components and methods involved in generating the design simulation.

Figure 3-1 Tree Diagram for Design Options

Figure 3-1 shows that the first option is to simulate a monitoring system with purely infrared components. The tachometer, pointed at the drum pulley, measures the speed. This tachometer has no direct contact with the moving parts and only uses IR signals to measure the rotating drum pulley's speed. It is incorporated with IR sensors to detect whether the product has reached the height limit, to avoid clogging issues at the boot. Multiple sensors equally spaced from one another are highly recommended to avoid false positives. A false positive means only one or two sensors detect that the height limit is reached, which could mean that only a specific side or portion of the boot holds most of the product, which may or may not affect belt operations.
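The multi-sensor voting idea above can be sketched as a simple quorum check. The four-sensor count and the quorum of three are assumptions for illustration, not client specifications.

```python
def boot_level_alarm(sensor_readings, quorum=3):
    """Raise the boot height-limit alarm only when at least `quorum` of the
    equally spaced IR sensors report the limit, filtering out false positives
    caused by corn piling up on only one side of the boot."""
    triggered = sum(1 for reading in sensor_readings if reading)
    return triggered >= quorum

# One sensor tripped: corn likely piled on a single side -> no alarm
single_side = boot_level_alarm([True, False, False, False])
# Three of four sensors tripped: the level limit is genuinely reached
genuine = boot_level_alarm([True, True, True, False])
```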

The next simulation design detects clogging at the boot using an ultrasonic sensor and an embedded tachometer to monitor the bucket elevator's grain height and speed. Once the clogging reaches the height limit, the unclogging mechanism will help clear the corn grains that may hinder production time.

Lastly, the simulation design will use image processing. Using the videos and images curated, it will
undergo video and image processing, respectively. The video processing will measure the speed, while the
image processing will determine the corn grain level. Due to limitations, the proponents will only use the
designed simulation of a scaled bucket elevator similar to the client’s existing design using SolidWorks. 

DESIGN OPTION 1 Using Infrared Tachometer and Infrared Sensor


Design Simulation

NI LabView is a graphical programming language that makes hardware configuration, measurement data, and debugging easier to visualize. It is an interactive program and execution system composed of configurable virtual instruments designed to accomplish tasks more easily than conventional programming. The program makes it simple to design custom engineering user interfaces and represent complex logic on the diagram.

Figure 3-2 NI LabView Front Panel for Design Option 1

Figure 3-2 shows the front panel of the designed simulation for Design Option 1 that consists of purely IR
components. The front panel shows the Controls, Sensors, Operation time, System Log, and Indicators of
the program and serves as the graphical user interface.  
Figure 3-3 Design Option 1 Simulation Process

Figure 3-3 shows the simulation process of Design Option 1. The front panel has two vertical toggle switches with a pre-selected default state of off. The power toggle button must be pressed to start the simulation, and the power LED must turn green. When the button is pressed, its value is 1 and the system functions; when it is not pressed, its value is 0 and the system is non-operational. Before initiating the corn grain release, the bucket speed must reach the average speed of 83 m/min, indicated by the speedometer and the green LED labelled "Speed OK.” The system log starts as soon as the simulation runs. The process inside is shown by the LED indicators “Boot OK”, ”Speed OK”, and System Status: green signifies normal flow, orange the warning state, and red the critical state. The Clearing LED indicator turns on when the level reaches the warning or critical state.

Applicable standards: ISO/IEC 25010:2011, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — System and software quality models; IEEE 2700-2014, Standard for Sensor Performance Parameter Definitions
Figure 3-4 NI LabView Block Diagram for Design Option 1

Figure 3-4 shows the design simulation's whole block diagram, with its main components revolving around infrared sensors. For the NI LabView block diagram, the proponents used graphical code representations to take full control of the objects represented in the front panel. This block diagram contains the graphical source code of the program. It is composed of sub-VIs, functions, terminals, constants, wires, and structures that interconnect for data transfer.

Figure 3-5 Power On and OFF logging


Figure 3-5 illustrates how the System Log operates behind the front panel. With this block diagram’s aid,
the date, time, bucket status, and system status are displayed in the front panel and Excel data log.

Figure 3-6 Operation Time Computation (runtime)


Figure 3-6 depicts how “Operation Time” works behind the front panel. This block diagram shows that as
soon as the operator starts the simulation, it will continuously track and compute the simulation's operating
time in hours until the operator decides to turn it off.

Figure 3-7 Date and Time Grabber

Figure 3-7 shows how it uses the computer’s current date and time and syncs it into the system to have
proper logging.

Figure 3-8 Error Odds

Since the simulation would otherwise run flawlessly, the proponents decided to make room for error. Figure 3-8 shows the block diagram responsible for producing the percent chance of error for the design simulation, roughly 1 in 1000.
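The roughly 1-in-1000 error chance can be sketched with a seeded random draw. This is an assumption about how such an error inducer might look outside LabVIEW, not a transcription of the actual block diagram.

```python
import random

ERROR_ODDS = 1 / 1000  # roughly 1 in 1000 iterations, as in the simulation

def maybe_error(rng):
    """Return True when a random fault should be injected this iteration."""
    return rng.random() < ERROR_ODDS

rng = random.Random(0)  # fixed seed so the sketch is reproducible
errors = sum(maybe_error(rng) for _ in range(100_000))  # expect about 100
```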
Figure 3-9 Value Handlers

Figure 3-9 shows the invisible value handlers present in the front panel; it is used to speed up data transfer
from one block diagram to another.

Figure 3-10 Speedometer

Figure 3-10 shows the block diagram responsible for the visual representation of the bucket elevator's acceleration on the front panel. This block diagram shows the range of the bucket elevator's current speed once the design simulation has started. In some instances, the bucket elevator operates at a reduced speed, such as when boot clogging is detected.
Figure 3-11 Critical Error Inducer

Figure 3-11 is the block diagram behind the “Manual Override” option in the front panel. It is also responsible for making errors occur, whether by chance or forced. It is accompanied by numerous numeric mathematical operations, a randomizer for the chance, and F-format strings to transfer the gathered data to be recorded in the activity logs.

Figure 3-12 Default Speed of the Bucket Elevator

Figure 3-12 shows the block diagram demonstrating the default value of the speedometer based on the
client’s current bucket elevator, which is 83 m/min.

Figure 3-13 CSV File Maker

Figure 3-13 shows the block diagram that makes the CSV file for data logging, which is essential for
knowing the downtime duration, the time it happened, belt speed during the downtime, and the system’s
status. It is also a requirement for the response time calculation.
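A sketch of the kind of CSV row the data logger produces follows. The column names and their order are assumptions for illustration, since the actual VI layout is only shown graphically in Figure 3-13.

```python
import csv
import io
from datetime import datetime

# Assumed column layout for the alert log; names and order are illustrative.
FIELDS = ["date", "time", "threat_level", "action", "belt_speed_m_per_min"]

def log_row(writer, threat_level, action, belt_speed):
    """Append one alert entry stamped with the current date and time."""
    now = datetime.now()
    writer.writerow([now.date().isoformat(),
                     now.time().isoformat("seconds"),
                     threat_level, action, belt_speed])

buffer = io.StringIO()  # stands in for the CSV file on disk
writer = csv.writer(buffer)
writer.writerow(FIELDS)
log_row(writer, "Warning", "Clearing started", 41.5)
```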
Figure 3-14 Volumetric Throughput

Figure 3-14 shows how the final computation of the volumetric throughput from the front panel is reflected
on the block diagram. Considering that the bucket is always 75% filled, three parameters were needed to
compute the Volumetric throughput: bucket speed, cubic meter per meter of the belt, and the simulated
bucket volume.
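The throughput computation above can be sketched as follows. Only the 83 m/min belt speed and the 75% fill factor come from the document; the bucket volume and bucket spacing are assumed values for illustration.

```python
# Illustrative volumetric throughput computation.
belt_speed = 83.0        # m/min, the client's average belt speed
bucket_volume = 0.01     # m^3 per bucket (assumed)
buckets_per_meter = 2.5  # buckets per meter of belt (assumed)
fill_factor = 0.75       # buckets are taken as always 75% filled

volume_per_meter = bucket_volume * buckets_per_meter       # m^3 per m of belt
throughput = belt_speed * volume_per_meter * fill_factor   # m^3 per minute
```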

Figure 3-15 Front Panel Essential Operators

Figure 3-15 shows the vital execution parts of the first design option simulation. It mainly consists of the front panel logic, which is in charge of the necessary operating conditions when starting and running the first design simulation. The next part caters to how the system log shows its results in the front panel based on the data gathered throughout the whole operation time. Lastly, the error reset repeatedly refreshes the system once an error has occurred and been stabilized by the design simulation.
Figure 3-16 Infrared sensors

Applicable standard: ISO 18251-1:2017, Non-destructive testing — Infrared thermography — Part 1: Characteristics of system and equipment
Figure 3-16 shows how the sensors on the simulated bucket elevator from the front panel work. The infrared sensors detect the descent of corn grains during operation. During operation, the grains mainly pile up at the first and fourth sensors’ locations, followed by gradual accumulation at the second and third sensors’ spots until the level reaches its warning/critical point, which then activates the pusher via a loop to restore the boot to its stabilized state.

Figure 3-17 Excel file of Alert Log 

Figure 3-17 shows a sample tabulated log of every activity during the operation time of the design option simulation. This log depicts the time, the threat level of the alert, the actions taken, and lastly the belt speed of the design simulation when the alert occurred.
Efficiency

Response Time = Time Processed − Time Detected

Table 3-1 Response Time of Design Option 1

Response Time of Design Option 1
Average Response Time    1.4266 secs
*See Appendix C for calculations of precision

Table 3-1 shows the average response time of Design Option 1.

Performance

Applicable standard: IEEE 1633-2016, IEEE Recommended Practice on Software Reliability

Failures in Time (FIT) = Failed Attempts / Total Operating Time

MTBF = 1 / FIT

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100
Table 3-2 Reliability of Design Option 1

Failures in Time (FIT)    Life Expectancy    Mean Time Between Failures    Annualized Failure Rate
(Failures per hour)       (Years)            (Hours)                       (%)
0.2083                    5                  4.8                           64.71

Table 3-2 shows the summary of the annualized failure rate for Design Option 1. Design Option 1’s AFR of 64.71% is the highest among the three designs. A FIT of 0.2083 failures per hour results in a mean time between failures of 4.8 hours over the 5-year life expectancy.
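The Table 3-2 figures can be reproduced directly from the reliability formulas given in Chapter 2:

```python
import math

# Figures from Table 3-2 (Design Option 1)
fit = 0.2083         # Failures in Time, in failures per hour
life_expectancy = 5  # years, as used in the document's AFR formula

mtbf = 1 / fit       # mean time between failures, about 4.8 hours
afr = (1 - math.exp(-life_expectancy / mtbf)) * 100  # annualized failure rate
```

Plugging in the FIT of 0.2083 gives an MTBF of about 4.8 hours and an AFR of 64.71%, matching the table.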

Economic 

Table 3-3 Economic Feasibility of Design Option 1

Costing Breakdown of Design Option 1

Component                                        Quantity    Unit Price    Total Price
Single Mode Power Supply 24V, 12V, 5V (150W)     1           2000.00       2000.00
Coaxial Cable                                    1           1000.00       1000.00
Resistor                                         5           5.00          25.00
Capacitor                                        5           10.00         50.00
Crystal Unit                                     5           20.00         100.00
Microcontroller Unit (Raw Chip)                  1           104.76        104.76
Transistor-Transistor Logic Converter            1           100.00        100.00
Infrared Sensor (Proximity)                      4           150.00        600.00
IR Tachometer with Serial Interface (RS-232)     1           20000.00      20000.00
High Torque Motor with Gearbox Assembly          1           5000.00       5000.00
Motor Driver                                     1           200.00        200.00
Aluminum Sheet (5mm x 100mm x 100mm)             2           863.67        1727.34
Aluminum Sheet (3mm x 100mm x 100mm)             2           824.07        1648.14
5m USB A to B Cable                              1           150.00        150.00
Coding Rate                                      1           5000.00       5000.00
TOTAL                                                                      37705.24

*See Appendix C for calculations
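The Table 3-3 total can be verified by summing the quantity-times-unit-price products and comparing against the ₱50,000 budget constraint:

```python
# Quantities and unit prices copied from Table 3-3 (PHP)
components = [
    ("Single Mode Power Supply 24V, 12V, 5V (150W)", 1, 2000.00),
    ("Coaxial Cable", 1, 1000.00),
    ("Resistor", 5, 5.00),
    ("Capacitor", 5, 10.00),
    ("Crystal Unit", 5, 20.00),
    ("Microcontroller Unit (Raw Chip)", 1, 104.76),
    ("Transistor-Transistor Logic Converter", 1, 100.00),
    ("Infrared Sensor (Proximity)", 4, 150.00),
    ("IR Tachometer with Serial Interface (RS-232)", 1, 20000.00),
    ("High Torque Motor with Gearbox Assembly", 1, 5000.00),
    ("Motor Driver", 1, 200.00),
    ("Aluminum Sheet (5mm x 100mm x 100mm)", 2, 863.67),
    ("Aluminum Sheet (3mm x 100mm x 100mm)", 2, 824.07),
    ("5m USB A to B Cable", 1, 150.00),
    ("Coding Rate", 1, 5000.00),
]
total = sum(qty * unit_price for _, qty, unit_price in components)
within_budget = total <= 50_000  # client budget constraint
```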


Critical Path Method

The Critical Path Method is a sequence of the scheduled activities that determines the duration of the project. It is a technique to find critical deadlines and deliver a project on time. CPM identifies the tasks that are critical, time-wise, in completing the project. It enables the proponents to identify the minimum amount of time the project needs to attain its goal.

Table 3-4 Design Option 1 Project Activities

Activity    Activity Description        Activity Predecessors    Activity Duration (Weeks)
A           Project Planning            -                        2
B           Software Familiarization    A                        1
C           Programming                 B                        2
D           Code Testing                C                        1
E           Code Debugging              C                        1
F           Simulation Assembly         C, D                     1
G           Design Improvement          E, F                     1
H           Testing                     G                        1

Table 3-4 shows the project activities for design option 1 to be used in critical path method analysis.

Figure 3-18 Network Activity for Design Option 1

Figure 3-18 shows the network activity for design option 1 to be used in critical path method analysis.

Where:

Early Start (ES): The earliest date the activity can start
Early Finish (EF): The earliest date the activity can finish
Float (FF): The maximum number of days the activity can be delayed without delaying any
succeeding activity
Late Finish (LF): The latest date the activity can finish without causing a delay to the project
completion date.
Late Start (LS): The latest date the activity can start without causing a delay to the project
completion date.

The first step in the calculation process in the Critical Path Method is known as Forward Pass. In the
forward pass, Early Start and Early Finish are calculated. Early Start for the first activity on Critical Path is
always 1.

Early Start = Early Finish of Predecessor Activity + 1
Early Finish = Activity Duration + Early Start − 1
Float = Late Start − Early Start

The next calculation process is known as Backward Pass. In Backward Pass, Late Start and Late Finish
are calculated. Late Finish of the last activity on Critical Path is always the Early Finish of the same activity.

Late Start = Late Finish of Activity − Activity Duration + 1
Late Finish = Late Start of Successor Activity − 1
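The forward and backward passes above can be sketched in a few lines over the Table 3-4 activity list, using the document's convention that the Early Start of the first activity is 1. The dictionary is written in topological order so a single loop per pass suffices.

```python
# Activities from Table 3-4: name -> (duration in weeks, predecessors)
activities = {
    "A": (2, []), "B": (1, ["A"]), "C": (2, ["B"]),
    "D": (1, ["C"]), "E": (1, ["C"]), "F": (1, ["C", "D"]),
    "G": (1, ["E", "F"]), "H": (1, ["G"]),
}

# Forward pass: ES = max(EF of predecessors) + 1, EF = ES + duration - 1
es, ef = {}, {}
for name, (dur, preds) in activities.items():
    es[name] = max((ef[p] for p in preds), default=0) + 1
    ef[name] = es[name] + dur - 1

project_duration = max(ef.values())

# Backward pass: LF = min(LS of successors) - 1, LS = LF - duration + 1
ls, lf = {}, {}
for name in reversed(list(activities)):
    succs = [s for s, (_, preds) in activities.items() if name in preds]
    lf[name] = min((ls[s] for s in succs), default=project_duration + 1) - 1
    ls[name] = lf[name] - activities[name][0] + 1

total_float = {name: ls[name] - es[name] for name in activities}
critical_path = [name for name in activities if total_float[name] == 0]
```

Running the sketch reproduces the nine-week duration and leaves Code Debugging (E) as the only activity with float.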

Figure 3-19 Critical Path Network Diagram for Design Option 1’s entire schedule of activities

Figure 3-19 shows the network diagram for design option 1 to be used in critical path method analysis.

Table 3-5 Critical Paths of Design Option 1


Path Events Length of Each Path Total Duration (Weeks)
1 A+B+C+D+F+G+H 2+1+2+1+1+1+1 9
2 A+B+C+F+G+H 2+1+2+1+1+1 8
3 A+B+C+E+G+H 2+1+2+1+1+1 8

Table 3-5 shows the possible path taken by the research. The proponents have plotted three (3) paths
taken to have the research's ideal course without compromising the stages essential in preparing and
producing the research.
Table 3-6 Total Time Duration of Design Option 1
Activity    Precedent    Duration    Early Start    Early Finish    Latest Start    Latest Finish    Float
A           -            2           1              2               1               2                0
B           A            1           3              3               3               3                0
C           B            2           4              5               4               5                0
D           C            1           6              6               6               6                0
E           C            1           6              6               7               7                1
F           C, D         1           7              7               7               7                0
G           E, F         1           8              8               8               8                0
H           G            1           9              9               9               9                0
Project duration: 9 weeks

Table 3-6 shows the summary of the research's ideal duration, which will last nine weeks. The critical path includes Project Planning, Software Familiarization, Programming, Code Testing, Simulation Assembly, Design Improvement, and Testing.
DESIGN OPTION 2 Using Embedded Tachometer and Ultrasonic Sensor

Design Simulation 

For the second design option, the proponents still considered the use of NI Labview because this graphical
programming language helped ease the visualization of the design option utilizing hardware configuration,
data measurement, and debugging. Using LabVIEW, the proponents can originate precisely the type of
virtual instrument needed and can easily view and modify data or control inputs. Each LabVIEW function is
designed with a user interface that offers real-time compilation and lets users interact with the code
immediately after writing it. 

Figure 3-20 NI LabView Front Panel for Design Option 2

Figure 3-20 shows the front panel of the designed simulation for Design Option 2 that mainly consists of a
contactless tachometer and ultrasonic sensor.
Figure 3-21 Design Option 2 Simulation Process

Figure 3-21 shows the working operations as soon as the simulation is turned on and the grain input has started. The process inside is shown by the LED indicators “Boot OK”, ”Speed OK”, and System Status: green signifies normal flow, orange the warning state, and red the critical state. The Clearing LED indicator turns on when the level reaches the warning or critical state.

Applicable standard: ISO/IEC 25010:2011, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — System and software quality models
Figure 3-22 NI LabView Block Diagram for Design Option 2

Figure 3-22 shows the complete block diagram of design option simulation no. 2, which heavily revolves around the use of an embedded tachometer and ultrasonic sensors. The proponents maximized the use of NI LabView and its graphical code to take full control of the objects present in the front panel. This compilation of block diagrams comprises sub-VIs, functions, terminals, constants, wires, and all the crucial structures needed for data transfer.

Figure 3-23 Power On and Off Logging

Figure 3-23 shows how the System Log operates in the front panel with the aid of different arrays and
strings; the date, time, bucket status, and system status can be displayed.

Figure 3-24 Operation Time Computation


Figure 3-24 shows how the "Operation Time" from the front panel works. This block diagram depicts that as
soon as the operator starts the simulation, it continuously computes the design simulation's runtime
hours until the operator decides to shut it down.
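As a minimal Python sketch of this runtime-accumulation behavior (the actual implementation is a LabVIEW block diagram; the class and method names here are illustrative assumptions):

```python
import time

# Hypothetical sketch of the "Operation Time" behavior in Figure 3-24:
# elapsed runtime accumulates from the moment the operator starts the
# simulation until shutdown.
class OperationTimer:
    def __init__(self):
        # Monotonic clock avoids jumps if the system clock is adjusted.
        self.start = time.monotonic()

    def runtime_hours(self):
        return (time.monotonic() - self.start) / 3600
```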

Figure 3-25 Date and Time Grabber

Figure 3-25 shows how the current computer’s date and time are being synced into the design simulation
with the aid of different arrays and strings.

Figure 3-26 Error Odds

Figure 3-26 shows how the proponents used the random number array to facilitate the percent chances
of error for the design simulation in terms of "Critical" and "Warning."
Figure 3-27 Value Handlers

Figure 3-27 shows the invisible Value handlers present in the front panel of the design simulation. It mainly
consists of transfer functions and string constants to speed up data transfer from one block diagram to
another.
Figure 3-28 Speedometer

Figure 3-28 shows the visual representation of the bucket elevator's belt speed from the front panel once
it is turned on. The speed is expressed in m/min.

Figure 3-29 Critical Error Inducer

Figure 3-29 is the block diagram responsible for the "Manual Override" option in the front panel. This block
diagram forces the simulation into a critical state: all sensors automatically clog up under this setting,
reducing the speed to 41.5 m/min and starting the unclogging operation.

Figure 3-30 Default Values

Figure 3-30 shows the maximum values that the speed and height limit of the sensors from the design
simulation can attain.
Figure 3-31 CSV File Maker

Figure 3-31 shows how the data logging of the design simulation starts at the Strip Path, which sets the
file directory, and is then passed to the File Path, where all of the data gathered from the design
simulation is written.
Figure 3-32 Volumetric Throughput Computation

Figure 3-32 shows how the numeric elements compute the design simulation's volumetric throughput once
the corn grains are inside the bucket elevator.

Figure 3-33 Front Panel Logic

Figure 3-33 shows the vital execution parts of this design option simulation. It mainly consists of the Front
Panel logic, which is in charge of the necessary operating conditions when starting and running the
design simulation. The next part caters to how the "System Log" shows the front panel results based
on the operation's data. Last is the Error Reset, which repeatedly refreshes the system once an
error has occurred and been stabilized by the design simulation.
(Standards: IEEE 2700-2014 — Standard for Sensor Performance Parameters Definitions;
ISO/TS 16829:2017 — Non-destructive testing — Automated Ultrasonic Testing)
Figure 3-34 Symphony of Embedded Tachometer and Ultrasonic Sensors

Figure 3-34 shows how the embedded tachometer and ultrasonic sensors from the front panel work. The
process starts when the speedometer is at its constant speed and the corn grains are placed inside.
With the aid of the millisecond timer value, the data gathered from the clogged grains is relayed to the
unclogger's two timers: a 500-millisecond timer for the warning state and a 2000-millisecond timer for
the critical state. Each sensor has an F-format string to directly log the data in the "Syslog", which is
then transferred to the file path responsible for Excel logging.
Figure 3-35  Excel file of Alert Log 

Figure 3-35 depicts the Excel data logfile from the Design Option Simulation. This figure highlights the
current time, the Threat level of the alert, the Status of the Design Option simulation, the accumulated
height of corn grains from each sensor, and the belt speed when the alert happened.
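A hypothetical sketch of how one row of the Figure 3-35 alert log could be written to a CSV file; the column order and the function name are assumptions based on the figure's description, not the actual LabVIEW implementation:

```python
import csv
import datetime

# Appends one alert row: current time, threat level, system status,
# accumulated grain height, and belt speed (column order assumed from
# the Figure 3-35 description).
def log_alert(path, threat_level, status, grain_height_cm, belt_speed_m_min):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(timespec="seconds"),
            threat_level, status, grain_height_cm, belt_speed_m_min,
        ])

# Example: a warning-state alert at the reduced unclogging speed.
log_alert("alert_log.csv", "Warning", "Unclogging", 80, 41.5)
```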

Efficiency
Response Time = Time Processed − Time Detected
Table 3-7 Response Time of Design Option 2

Response Time of Design Option 2


Average Response Time 1.4325 secs
*See Appendix C for calculations of precision

Table 3-7 shows the summary of the average response time for design option 2. 

Performance

Failures in Time (FIT) = Failed Attempts / Total Operating Time

MTBF = 1 / Failures in Time (FIT)

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

(Standard: IEEE 1633-2016 — IEEE Recommended Practice on Software Reliability)

Table 3-8 Reliability of Design Option 2

Reliability of Design Option 2


Failures in Time (FIT)  Life Expectancy  Mean Time Between Failures  Annualized Failure Rate (%)
(failures per hour)     (years)          (hours)
0.1667                  5                6                           56.54

Table 3-8 shows the summary of the annualized failure rate for Design Option 2. Design Option 2's AFR is
56.54%, which is still considered high. A FIT of 0.1667 failures per hour results in a mean time
between failures of 6 hours over a 5-year life expectancy.
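As a check, the Table 3-8 figures can be reproduced from the FIT, MTBF, and AFR formulas above. This is a minimal sketch, assuming (as the tabulated values imply) that the exponent is taken as the unitless ratio of life expectancy in years to MTBF in hours:

```python
import math

# Reproducing the Table 3-8 reliability figures for Design Option 2.
fit = 0.1667                 # failures per hour of operating time
mtbf = 1 / fit               # mean time between failures, about 6 hours
life_expectancy = 5          # years

afr = (1 - math.exp(-life_expectancy / mtbf)) * 100

print(round(mtbf, 2))        # 6.0
print(round(afr, 2))         # 56.55 (the table's 56.54 rounds MTBF to 6 first)
```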
Economic 

Table 3-9 Economic Feasibility of Design Option 2

Costing Breakdown of Design Option 2

Component Quantity Unit Price Total Price


Single Mode Power Supply 24V, 12V, 5V (150W) 1 2000.00 2000.00

135m Coaxial Cable 1 1000.00 1000.00

Resistor 5 5.00 25.00

Capacitor 5 10.00 50.00

Crystal Unit 5 20.00 100.00

MCU ( Raw Chip) 1 104.76 104.76

Transistor-Transistor Logic Converter 1 100.00 100.00

Ultrasonic Sensor 4 220.00 880.00

Speedometer 1 1000.00 1000.00

High Torque Motor with Gearbox Assembly 1 5000.00 5000.00

Motor Driver 1 200.00 200.00

Aluminum Sheet (5mmx1000mmx1000mm) 2 863.67 1727.34

Aluminum Sheet (3mmx1000mmx1000mm) 2 824.07 1648.14

5m USB A to B Cable 1 150.00 150.00

Coding Rate 1 5000.00 5000.00

TOTAL 18985.24

*See Appendix C for calculations


Critical Path Method 

The Critical Path Method (CPM) is a sequence of scheduled activities that determines the duration of the
project. It is a technique to find critical deadlines and deliver a project on time. CPM identifies the
time-critical tasks in completing the project and enables the proponents to identify the minimum
amount of time the project needs to attain its goal.

Table 3-10 Design Option 2 Project Activities

Activity Activity Description Activity Predecessors Activity Duration (Weeks)


A Project Planning - 2
B Software Familiarization A 1
C Programming B 2
D Code Testing C 1
E Code Debugging  C 1
F Simulation Assembly C, D 1
G Design Improvement E, F 1
H Testing G 1

Table 3-10 shows the project activities for design option 2 to be used in critical path method analysis.

Figure 3-36 Network Activity for Design Option 2

Figure 3-36 shows the network activity for design option 2 to be used in critical path method analysis.

Where:

Early Start (ES): The earliest date the activity can start
Early Finish (EF): The earliest date the activity can finish
Float (FF): The maximum number of days the activity can be delayed without delaying any
succeeding activity
Late Finish (LF): The latest date that the activity can finish without causing a delay to the project
completion date.
Late Start (LS): The latest date that the activity can start without causing a delay to the project
completion date.

The first step in the calculation process in the Critical Path Method is known as Forward Pass. In the
forward pass, Early Start and Early Finish are calculated. Early Start for the first activity on Critical Path is
always 1. 

Early Start = Early Finish of predecessor activity + 1

Early Finish = Early Start + Activity Duration − 1

Float = Late Start − Early Start

The next calculation process is known as Backward Pass. In Backward Pass, Late Start and Late Finish
are calculated. Late Finish of the last activity on Critical Path is always the Early Finish of the same
activity. 

Late Start = Late Finish of activity – Activity Duration + 1


Late Finish = Late Start of successor activity - 1 
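The forward and backward passes described above can be sketched in Python. This is a hedged illustration, not part of the LabVIEW design: the activity list follows Table 3-10, with H's predecessors taken from Table 3-12, and the day numbering follows the text (Early Start of the first activity is 1).

```python
# name: (predecessors, duration in weeks), listed in topological order.
activities = {
    "A": ((), 2), "B": (("A",), 1), "C": (("B",), 2),
    "D": (("C",), 1), "E": (("C",), 1), "F": (("C", "D"), 1),
    "G": (("E", "F"), 1), "H": (("E", "G"), 1),
}

# Forward pass: ES = max(EF of predecessors) + 1, EF = ES + duration - 1.
es, ef = {}, {}
for name, (preds, dur) in activities.items():
    es[name] = max((ef[p] for p in preds), default=0) + 1
    ef[name] = es[name] + dur - 1

project_duration = max(ef.values())

# Backward pass: LF = min(LS of successors) - 1, LS = LF - duration + 1.
ls, lf = {}, {}
for name in reversed(list(activities)):
    succs = [s for s, (p, _) in activities.items() if name in p]
    lf[name] = min((ls[s] for s in succs), default=project_duration + 1) - 1
    ls[name] = lf[name] - activities[name][1] + 1

floats = {n: ls[n] - es[n] for n in activities}
critical_path = [n for n in activities if floats[n] == 0]

print(project_duration)  # 9, matching Table 3-12
print(critical_path)     # ['A', 'B', 'C', 'D', 'F', 'G', 'H'] (E has a float of 1)
```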

Figure 3-37 Critical Path Network Diagram for Design Option 2’s entire schedule of activities

Figure 3-37 shows the network diagram for design option 2 to be used in critical path method analysis.

Table 3-11 Critical Paths of Design Option 2


Path Events Length of Each Path Total Duration (Weeks)
1 A+B+C+D+F+G+H 2+1+2+1+1+1+1 9
2 A+B+C+F+G+H 2+1+2+1+1+1 8
3 A+B+C+E+G+H 2+1+2+1+1+1 8

Table 3-11 shows the possible path taken by the research. The proponents have plotted three (3) paths
taken to have the research's ideal course without compromising the stages essential in preparing and
producing the research.
Table 3-12 Total Time Duration of Design Option 2
Activity Precedent Duration Early Start Early Finish Latest Start Latest Finish Float
A - 2 1 2 1 2 0
B A 1 3 3 3 3 0
C B 2 4 5 4 5 0
D C 1 6 6 6 6 0
E C 1 6 6 7 7 1
F C, D 1 7 7 7 7 0
G E, F 1 8 8 8 8 0
H E, G 1 9 9 9 9 0
Project 9

Table 3-12 shows the summary of the research's ideal duration, which will last nine weeks. The critical path
network includes Project Planning, Software Familiarization, Programming, Code Testing, Simulation
Assembly, Design Improvement, and Testing.
DESIGN OPTION 3 Using Image Processing

Design Simulation

For the last design option simulation, the proponents still considered the use of NI LabVIEW because its
graphical programming language eases the visualization of the design option simulation, which heavily
utilizes hardware configuration, debugging, and data measurement. With the aid of NI LabVIEW, the
proponents can precisely pinpoint the type of virtual instrument needed to easily modify and view
the data or even control its inputs. Every LabVIEW function is designed with a user interface that offers a
real-time compilation of results and lets users interact with the code immediately after writing it.

Figure 3-38 NI LabView Front Panel for Design Option 3

Figure 3-38 shows the front panel of the designed simulation for Design Option 3 utilizing image processing
and mainly consisting of simulations from Solidworks.
Figure 3-39 Design Option 3 Simulation Process

Figure 3-39 shows the simulation process of the design option simulation as soon as the simulation starts
running. The process inside is shown by the "Boot OK" LED indicator: green signifies that the flow is
normal, orange a warning state, and red a critical state. The Clearing LED indicator
turns on when the level reaches the warning or critical state.

(Standard: ISO/IEC 7942-1 — Information technology — Computer graphics and image processing —
Graphical Kernel System (GKS))

Figure 3-40 NI LabView Block Diagram for Design Option 3

Figure 3-40 shows the complete block diagram used in making Design Option Simulation No. 3, which
mainly revolves around cameras and image processing. The proponents relied on NI LabVIEW to fully
showcase the simulation and its graphical representation and to take full control of the objects.

Figure 3-41 CSV File Maker

Figure 3-41 shows the data logging of the design option simulation. It starts at the Strip Path, which leads
into the directory, and is then passed to the File Path, where all of the design option simulation's data
is written.

Figure 3-42 Volumetric Throughput Computation

Figure 3-42 shows the numeric elements that compute the design option simulation's volumetric
throughput. The volumetric throughput is calculated once the corn grains are inside the bucket elevator.
Figure 3-43 Value Handlers

Figure 3-43 shows the invisible Value Handlers present in the design option simulation's front panel.
Transfer functions and string constants make up the value handlers to speed up data transfer from one
block to another.

Figure 3-44 Default and Basis

Figure 3-44 shows the use of Strings and Booleans to set the default value as the timing basis for Image
Processing.

Figure 3-45 Date and Time Grabber

Figure 3-45 shows the syncing of the computer's current date and time into the design option simulation.
This process is made possible by the different sets of arrays and strings visible in the figure.
Figure 3-46 Error Odds

Figure 3-46 shows how the proponents used the random number array to facilitate the percent chances of
error for the design option simulation's "Critical" and "Warning" statuses.

Figure 3-47 Belt Speed Image Processing

Figure 3-47 shows how the design option simulation processes each frame in real time. After acquiring the
images needed, the block diagram converts each image or frame to grayscale and then to a binary image
to ease the process of detecting the bucket pattern. This block diagram also shows a trigger that detects
the vital frame for speed computation. The While Loop in the second acquisition contains some of the
essential parts necessary for the boot.
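One plausible reading of the speed computation is sketched below, assuming the trigger yields the times at which successive buckets pass a reference line and that the bucket spacing on the belt is known; the spacing value is purely illustrative:

```python
# Hedged sketch of the Figure 3-47 speed computation: average the
# intervals between bucket detections and convert the known bucket
# spacing over that interval into a belt speed in m/min.
def belt_speed_m_per_min(trigger_times_s, bucket_spacing_m=0.5):
    if len(trigger_times_s) < 2:
        return 0.0  # not enough detections to estimate a speed
    intervals = [b - a for a, b in zip(trigger_times_s, trigger_times_s[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return bucket_spacing_m / avg_interval * 60

# Buckets spaced 0.5 m apart, detected once per second -> 30 m/min.
print(belt_speed_m_per_min([0.0, 1.0, 2.0, 3.0], 0.5))  # 30.0
```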

Figure 3-48 Boot Operations

Figure 3-48 shows the block diagram responsible for representing the boot in the front panel. This figure
shows the conversion of the image or frame to grayscale and then to a binary image for separating the
background and foreground. The proponents estimate the percentage filled inside the boot; once the
simulated corn grains reach an 80% to 90% fill, a signal is triggered for the "Warning" or "Critical" status
by the next part of the loop, which represents the unclogger.
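The fill-level trigger described above can be sketched as follows. The 80% and 90% thresholds come from the text, while the binary-image representation, the exact cutoff behavior, and the status names are illustrative assumptions:

```python
# Estimate the boot fill from a binary image (1 = grain foreground)
# and map it to the simulation's status levels.
def boot_status(binary_image):
    total = sum(len(row) for row in binary_image)
    filled = sum(sum(row) for row in binary_image)
    fill_pct = 100 * filled / total
    if fill_pct > 90:
        return "Critical"
    if fill_pct >= 80:
        return "Warning"
    return "Normal"

# A toy 1-row "image" that is 80% foreground triggers a Warning.
print(boot_status([[1] * 8 + [0] * 2]))  # Warning
```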

Efficiency
Response Time = Time Processed − Time Detected

Table 3-13 Response Time of Design Option 3

Response Time of Design Option 3


Average Response Time 0.0072 secs
*See Appendix C for calculations of precision
Table 3-13 shows the average response time of Design Option 3.

Performance

Failures in Time (FIT) = Failed Attempts / Total Operating Time

MTBF = 1 / Failures in Time (FIT)

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

(Standard: IEEE 1633-2016 — IEEE Recommended Practice on Software Reliability)
Table 3-14 Reliability of Design Option 3

Reliability of Design Option 3


Failures in Time (FIT)  Life Expectancy  Mean Time Between Failures  Annualized Failure Rate (%)
(failures per hour)     (years)          (hours)
0.0417                  5                23.98                       18.82

Table 3-14 shows the summary of the annualized failure rate for Design Option 3. Design Option 3's AFR is
18.82%, the lowest and most reliable among the design options. A FIT of 0.0417 failures per hour results
in a mean time between failures of 23.98 hours over a 5-year life expectancy.
Economic 

Table 3-15 Economic Feasibility of Design Option 3 

Costing Breakdown of Design Option 3

Component Quantity Unit Price Total Price

Single Mode Power Supply 24V, 12V, 5V (150W) 1 2000.00 2000.00

Coaxial Cable 1 1000.00 1000.00

Resistor 5 5.00 25.00

Capacitor 5 10.00 50.00

Crystal Unit 5 20.00 100.00

Microcontroller Unit (Raw Chip) 1 104.76 104.76

Transistor-Transistor Logic Converter 1 100.00 100.00

CCTV or Webcam 2 1000.00 2000.00

High Torque Motor with Gearbox Assembly 1 5000.00 5000.00

Motor Driver 1 200.00 200.00

Aluminum Sheet (5mmx1000mmx1000mm) 2 863.67 1727.34

Aluminum Sheet (3mmx1000mmx1000mm) 2 824.07 1648.14


5m USB A to B Cable 1 150.00 150.00

Coding Rate 1 7000.00 7000.00

TOTAL 21105.24

*See Appendix C for calculations

Critical Path Method 

The Critical Path Method (CPM) is a sequence of scheduled activities that determines the duration of the
project. It is a technique to find critical deadlines and deliver a project on time. CPM identifies the
time-critical tasks in completing the project and enables the proponents to identify the minimum
amount of time the project needs to attain its goal.

Table 3-16 Design Option 3 Project Activities

Activity Activity Description Activity Predecessors Activity Duration (Weeks)


A Project Planning - 2
B Software Familiarization A 1
C Programming B 4
D Code Testing C 1
E Code Debugging  C 1
F Simulation Assembly C, D 1
G Design Improvement E, F 1
H Testing G 1

Table 3-16 shows the project activities for design option 3 to be used in critical path method analysis.
Figure 3-49 Network Activity for Design Option 3

Figure 3-49 shows the network activity for design option 3 to be used in critical path method analysis.

Where:

Early Start (ES): The earliest date the activity can start
Early Finish (EF): The earliest date the activity can finish
Float (FF): The maximum number of days the activity can be delayed without delaying any
succeeding activity.
Late Finish (LF): The latest date that the activity can finish without causing a delay to the project
completion date.
Late Start (LS): The latest date that the activity can start without causing a delay to the project
completion date.

The first step in the calculation process in the Critical Path Method is known as Forward Pass. In the
forward pass, Early Start and Early Finish are calculated. Early Start for the first activity on Critical Path is
always 1. 

Early Start = Early Finish of predecessor activity + 1

Early Finish = Early Start + Activity Duration − 1

Float = Late Start − Early Start

The next calculation process is known as Backward Pass. In Backward Pass, Late Start and Late Finish
are calculated. Late Finish of the last activity on Critical Path is always the Early Finish of the same
activity. 

Late Start = Late Finish of activity – Activity Duration + 1


Late Finish = Late Start of successor activity - 1 

Figure 3-50 Critical Path Network Diagram for Design Option 3’s entire schedule of activities

Figure 3-50 shows the network diagram for the design option 3 to be used in critical path method analysis.
Table 3-17 Critical Paths of Design Option 3
Path Events Length of Each Path Total Duration (Weeks)
1 A+B+C+D+F+G+H 2+1+4+1+1+1+1 11
2 A+B+C+F+G+H 2+1+4+1+1+1 10
3 A+B+C+E+G+H 2+1+4+1+1+1 10

Table 3-17 shows the possible path taken by the research. The proponents have plotted three (3) paths
taken to have the research's ideal course without compromising the stages essential in preparing and
producing the research.

Table 3-18 Total Time Duration of Design Option 3


Activity Precedent Duration Early Start Early Finish Latest Start Latest Finish Float
A - 2 1 2 1 2 0
B A 1 3 3 3 3 0
C B 4 4 7 4 7 0
D C 1 8 8 8 8 0
E C 1 8 8 9 9 1
F C, D 1 9 9 9 9 0
G E, F 1 10 10 10 10 0
H E, G 1 11 11 11 11 0
Project 11

Table 3-18 shows the summary of the research's ideal duration, which will last eleven weeks. The critical
path network includes Project Planning, Software Familiarization, Programming, Code Testing, Simulation
Assembly, Design Improvement, and Testing.
CHAPTER 4 – Design Constraints, Trade-offs, and Standards

In this chapter, to determine the ideal solution among the design simulations presented, the proponents
thoroughly examine and discuss which design simulation best meets the client's requirements. In
determining the most feasible design, the proponents maximized the use of Sensitivity Analysis and Pareto
Optimality to apply the ranking theorem and further analyze which design aligns with the criteria that
best suit the final design simulation implementation.

Constraints

Response Time

This design criterion focuses on the consistency of the maximum processing time: a shorter processing
time is better for the design simulation, since the system should notify the operator quickly about any
irregularities happening inside the bucket elevator. Once the boot system reaches high levels of corn
grains, it should immediately start unclogging to avoid disrupting the bucket elevator's operation.
However, the time it takes the system to unclog is not considered.

Response Time = Time Processed − Time Detected
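For example, applying the response-time definition above to a pair of hypothetical timestamps (the values are illustrative, chosen to match Design Option 2's average from Table 3-7):

```python
from datetime import datetime

# Hypothetical detection and processing timestamps.
detected = datetime(2020, 12, 1, 10, 0, 0)
processed = datetime(2020, 12, 1, 10, 0, 1, 432500)

response_time = (processed - detected).total_seconds()
print(response_time)  # 1.4325
```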

Reliability

The reliability constraint assures that the output of the device is consistent and that the device is
dependable at any time. The device's output should be consistent, meaning it generates the same overall
system output when performing multiple iterations. A high level of component reliability is essential
to keep idle times to a minimum. A component's or a whole system's reliability is measured by the mean
time between failures (MTBF) (High Availability and Disaster Recovery, 2006). However, the Annualized
Failure Rate (AFR) gives more transparency as to how reliable the system is, providing a better view of the
chance of a failure in any given year. AFR is computed from the MTBF itself, as shown below:

Failures in Time (FIT) = Failed Attempts / Total Operating Time

MTBF = 1 / Failures in Time (FIT)

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100
Economic

This part of the design constraints covers the costs associated with the development. The proponents have
aimed for a low-cost device to meet the client's budget; thus, the cost should not exceed PHP 50,000.00,
and anything beyond that amount is out of scope. The raw components' prices are based on DEECO
Electronics, Alexan Commercial, E-Gizmo, and Sparkfruit Online. The design alternative with the lowest
cost wins this trade-off.

Trade-offs

Further evaluation of design alternatives is needed in choosing the best design to be implemented in the
project using trade-off analysis. Moreover, these are based on engineering requirements, standards, and
constraints. Considering the multiple constraint requirements, not all possible solutions can achieve the
project design's specified requirements. For the proponents to determine which design alternative can
satisfy the constraints' demand, they will be presenting the trade-offs. This section shows the approaches
used in trading off the bucket elevator monitoring and unclogging system. The criteria for trade-offs are
quantified for numerical comparisons.

Sensitivity Analysis

To fully determine the most feasible design simulation solution, the Theorem of Sensitivity Analysis was
applied with the following equations to rank the constraints enumerated from Chapter 2 quantitatively:

Design Ranking Value = [1 − (Governing Value − Value to be Ranked) / Governing Value] × 10 − 5

However, at times when the governed value is negative, the proponents can use:

Design Ranking Value = 5 − {[1 − (Governing Value − Value to be Ranked) / Governing Value] × 10}
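A minimal sketch of the two ranking equations, assuming (as Tables 4-1 through 4-8 imply) that the weighted rank is the design ranking value multiplied by the criterion's percent weight; the function names are illustrative:

```python
# Design Ranking Value per the equations above: the positive-governing
# form reduces to (value / governing) * 10 - 5, and the second form
# flips the scale when the governing value is negative.
def design_ranking_value(value, governing):
    scaled = (1 - (governing - value) / governing) * 10
    return 5 - scaled if governing < 0 else scaled - 5

# Weighted rank used in the criteria tables (weight given in percent).
def weighted_rank(value, governing, weight_percent):
    return design_ranking_value(value, governing) * weight_percent / 100

# A criterion value equal to the governing value ranks at the top (5);
# at a 33.33% weight this gives roughly the 1.6667 seen in Table 4-1.
print(design_ranking_value(1.0, 1.0))  # 5.0
```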

Table 4-1 Design Criteria using Sensitivity Analysis 


Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Efficiency      5                    41.67%    0.258     0.1075   0.225     0.0938   -4.976    -2.0733
Performance     4                    33.33%    -2.0916   -0.6972  -1.6713   -0.5571  5.000     1.6667
Economic Scale  3                    25.00%    -2.541    -0.6353  1.203     0.3008   0.779     0.1948
Overall Rank                                             -1.225             -0.1625            -0.2118
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations.

Table 4-1 depicts the sensitivity analysis values in each design simulation concerning its criterion listed.
Evidently, from the table, Design Option Simulation 1 outperformed all of the other options in terms of the
Efficiency criterion with a rank value of 0.1075, making it the most favorable option concerning Response
Time. Moreover, under the Performance criterion, Design Option Simulation 3 has the least amount of
annualized Failure rate of 1.6667, making it the most suitable under performance criteria. Under this
analysis, Design Option Simulation 2 gave the least amount of monetary expenses to be used, making it
the most suitable option under the Economic Scale criterion with a rank value of 0.3008. For the overall
result, Design Option Simulation 2 performed best in terms of Economic Scale and overall ranking;
however, its Efficiency and Performance were outshined by the other Design Option Simulations.
To further analyze this situation and identify the ideal Design Option Simulation, the proponents will
maximize the use of Sensitivity analysis where the level of importance is modified. This method is done
further to determine the behavior of Design Option Simulation 2 when the level of importance of the criteria
changes.

Table 4-2 Design Criteria using Sensitivity Analysis with Efficiency criterion having the highest level of
importance value

Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Efficiency      5                    55.56%    0.258     0.1433   0.225     0.3269   -4.976    -2.7644
Performance     2                    22.22%    -2.0916   -0.4488  -1.6713   -0.3714  5.000     1.1111
Economic Scale  2                    22.22%    -2.541    -0.5647  1.203     0.2673   0.779     0.1731
Overall Rank                                             -0.8702            0.2228             -1.4802
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations. 

Table 4-2 clearly shows that Design Option Simulation 2 dominated all the other design simulations when
the highest level of importance was given to Efficiency, that is, to each Design Option Simulation's
Response Time, making it the most viable option for Efficiency. Under the Performance criterion, Design
Option Simulation 3 still dominates with a value of 1.1111, and under the Economic Scale criterion, Design
Option Simulation 2 scored the best ranking in terms of least expenses with a value of 0.2673. With Design
Option Simulation 2 having the best Response Time, it placed first in the overall rankings compared to the
other Design Option Simulations, garnering an overall score of 0.2228 and making it the ideal design when
the Efficiency criterion has the highest level of importance.

Table 4-3 Design Criteria using Sensitivity Analysis with Performance criterion having the highest level of
importance value
Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Efficiency      2                    22.22%    0.258     0.0573   0.225     0.05     -4.976    -1.1058
Performance     5                    55.56%    -2.0916   -1.162   -1.6713   -0.9285  5.000     2.7778
Economic Scale  2                    22.22%    -2.541    -0.5647  1.203     0.2673   0.779     0.1731
Overall Rank                                             -1.6694            -0.6112            1.8451
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations.
 
Table 4-3 highlights the performance of each design simulation. It clearly illustrates that Design Option
Simulation 3 outperformed all the other design options in terms of Annualized Failure Rate. Under the
Efficiency criterion, Design Option Simulation 1 dominated the ranking with a value of 0.0573, while under
the Economic Scale criterion, Design Option Simulation 2 showed the least monetary expenses, giving it a
ranking of 0.2673. With Design Option Simulation 3 showcasing the least annualized failure rate, it got
first place with an overall rank of 1.8451 compared to the other Design Option Simulations, making it the
most viable option when the highest level of importance focuses on Performance.

Table 4-4 Design Criteria using Sensitivity Analysis with Economic Scale criterion with the highest level of
importance value
Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Efficiency      2                    22.22%    0.258     0.0573   0.225     0.05     -4.976    -1.1058
Performance     2                    22.22%    -2.0916   -0.4488  -1.6713   -0.3714  5.000     1.1111
Economic Scale  5                    55.56%    -2.541    -1.4117  1.203     0.6683   0.779     0.4328
Overall Rank                                             -1.8032            0.3469             -0.4275
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations.
Table 4-4 showcases the economic feasibility of each Design Option Simulation. It clearly illustrates that
Design Option Simulation 2 outperformed all of the other design options in terms of total cost, with
₱18,985.24 ≤ ₱50,000.00. While Design Option Simulation 1 performs poorly in terms of Performance
and Economic Scale, it compensates under the Efficiency criterion with a ranking of 0.0573. Design Option
Simulation 3 displayed a favorable result within this analysis for Performance with a ranking of 1.1111.
Furthermore, Design Option Simulation 2 performed the best compared to the other Design Option
Simulations with an overall rank of 0.3469, making it the best option when the level of importance is
highly prioritized at the Economic Scale.

Table 4-5 Design Criteria using Sensitivity Analysis with all criteria having the highest level of importance
value

Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Response Time   5                    33.33%    0.258     0.086    0.225     0.075    -4.976    -1.6587
Reliability     5                    33.33%    -2.0916   -0.6972  -1.6713   -0.5571  5.000     1.6667
Economic Scale  5                    33.33%    -2.541    -0.847   1.203     0.401    0.779     0.2597
Overall Rank                                             -1.4582            -0.0811            -0.2517
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations

Table 4-5 shows the analysis of the following criteria: Response Time, Reliability, and Economic
Feasibility, with their levels of importance equally divided. Based on the gathered results, Design
Option Simulation 1 is the best option for the Response Time criterion under this analysis, while under
the Reliability criterion, Design Option Simulation 3 is highly recommended with a rank value of
1.6667. However, under the overall ranking, Design Option Simulation 2 outperformed both other Design
Option Simulations with an overall rank of -0.0811, making it the best overall recommendation.

Table 4-6 Design Criteria using Sensitivity Analysis with the Efficiency criterion having the highest level
of importance value

Criteria        Level of Importance  % Weight  Option 1  Rank     Option 2  Rank     Option 3  Rank
Efficiency      5                    55.56%    0.258     0.1433   0.225     0.125    -4.976    -2.7644
Performance     3                    33.33%    -2.0916   -0.6972  -1.6713   -0.5571  5.000     1.6667
Economic Scale  1                    11.11%    -2.541    -0.2823  1.203     0.1337   0.779     0.0866
Overall Rank                                             -0.8362            -0.2984            -1.1843
*Reference: Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research
in Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf
*See Appendix C for Calculations.

Table 4-6 shows the analysis of the criteria Efficiency, Performance, and Economic Scale with levels of importance of five (5), three (3), and one (1), respectively. The level of importance is varied to determine each criterion's behavior as its weight changes; in this scenario the Efficiency criterion is weighted most heavily. Design Option Simulation 3 is the ideal option in terms of Performance with a rank value of 1.6667, while under Economic Scale, Design Option Simulation 2 remains the most viable option. Under the Efficiency criterion itself, Design Option Simulation 1 outshines the other options with a rank of 0.1433. Even though the most suitable option for Efficiency is Design Option Simulation 1, Design Option Simulation 2 still leads with the highest overall rank of -0.2984 when Efficiency has the highest level of importance.
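The weighting scheme in the table above can be reproduced with a short sketch. The criterion values are taken from Table 4-6; the weight formula (each importance level divided by the sum of importance levels) is an assumption consistent with the tabulated percentages, not a formula quoted from the document:

```python
values = {                       # raw criterion values per design option (Table 4-6)
    "Efficiency":     [0.258, 0.225, -4.976],
    "Performance":    [-2.0916, -1.6713, 5.000],
    "Economic Scale": [-2.541, 1.203, 0.779],
}
importance = {"Efficiency": 5, "Performance": 3, "Economic Scale": 1}

total = sum(importance.values())
weights = {c: lvl / total for c, lvl in importance.items()}   # 55.56%, 33.33%, 11.11%

overall = [0.0, 0.0, 0.0]
for crit, vals in values.items():
    for i, v in enumerate(vals):
        overall[i] += weights[crit] * v       # weighted rank of each table cell

# Options 1 and 2 reproduce the tabulated overall ranks (-0.8362, -0.2984);
# the straightforward sum for Option 3 comes out near -1.0112 rather than
# the tabulated -1.1843, so the table may carry a transcription difference.
print([round(r, 4) for r in overall])
```

The same sketch applied with the importance levels of Tables 4-7 through 4-12 reproduces the corresponding weight columns.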

Table 4-7 Design Criteria using Sensitivity Analysis with Performance criterion with the highest level of
importance value

Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        1            11.11%      0.258       0.0287     0.225       0.025      -4.976      -0.5529
Performance       5            55.56%      -2.0916     -1.162     -1.6713     -0.9285    5.000       2.7778
Economic Scale    3            33.33%      -2.541      -0.847     1.203       0.401      0.779       0.2597
Overall Rank                               -1.9803                -0.5025                2.4846
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations
Table 4-7 shows the analysis of the criteria Efficiency, Performance, and Economic Scale with levels of importance of one (1), five (5), and three (3), respectively. This analysis prioritizes the Performance criterion so that the proponents can determine its behavior. The results clearly show that Design Option Simulation 3 outperformed all of the other options in terms of annualized failure rate with a rank of 2.7778. Under the Efficiency criterion, Design Option Simulation 1 showed dominance with a rank of 0.0287 out of a possible 1.1111, while under the Economic Scale criterion, Design Option Simulation 2 outperformed the other options with a rank of 0.401. With this analysis, Design Option Simulation 3 is the most viable option with an overall rank of 2.4846.

Table 4-8 Design Criteria using Sensitivity Analysis with Economic Scale criterion having the highest level
of importance value
Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        3            33.33%      0.258       0.086      0.225       0.075      -4.976      -1.6587
Performance       1            11.11%      -2.0916     -0.2324    -1.6713     -0.1857    5.000       0.5556
Economic Scale    5            55.56%      -2.541      -1.4117    1.203       0.6683     0.779       0.4328
Overall Rank                               -1.5581                0.5576                 -0.6703
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations
Table 4-8 showcases the analysis of the criteria Efficiency, Performance, and Economic Scale with levels of importance of three (3), one (1), and five (5), respectively. Under this analysis, the proponents primarily prioritized the Economic Scale criterion. Based on the results, Design Option Simulation 2 leads in terms of Economic Scale with a rank value of 0.6683, while Design Option Simulation 1 is the most viable in Efficiency with a rank of 0.086. Under the Performance criterion, Design Option Simulation 3 peaked highest with a rank of 0.5556. Based on this analysis, when the Economic Scale has the highest level of importance, Design Option Simulation 2 still outperforms all of the other options with an overall rank of 0.5576.

Table 4-9 Design Criteria using Sensitivity Analysis with all criteria having the highest level of importance
value
Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        3            33.33%      0.258       0.086      0.225       0.075      -4.976      -1.6587
Performance       3            33.33%      -2.0916     -0.6972    -1.6713     -0.5571    5.000       1.6667
Economic Scale    3            33.33%      -2.541      -0.847     1.203       0.401      0.779       0.2597
Overall Rank                               -1.4582                -0.0811                0.2677
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations
Table 4-9 showcases the analysis of the criteria Efficiency, Performance, and Economic Scale with their levels of importance equally divided. As observed from the results, Design Option Simulation 1 is the best option for the Efficiency criterion with a top rank of 0.086; on the other hand, for the Performance criterion, Design Option Simulation 3 is highly recommended with its rank of 1.6667. For the Economic Scale criterion, Design Option Simulation 2 came out as the most recommended option with a rank value of 0.401. Under this table, Design Option Simulation 3 came out as the best option with the highest overall rank of 0.2677.

Table 4-10 Design Criteria using Sensitivity Analysis with Efficiency criterion with the highest level of
importance value
Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        4            57.14%      0.258       0.1474     0.225       0.1286     -4.976      -2.8434
Performance       2            28.57%      -2.0916     -0.5976    -1.6713     -0.4775    5.000       1.4286
Economic Scale    1            14.29%      -2.541      -0.368     1.203       0.1719     0.779       0.1113
Overall Rank                               -0.8182                -0.177                 -1.3035
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations
Table 4-10 highlights the analysis of the criteria Efficiency, Performance, and Economic Scale with levels of importance of four (4), two (2), and one (1), respectively, focusing on the Efficiency criterion to examine the design simulations' behavior when tested under numerous changes in importance. As generated by the results, Design Option Simulation 1 ranks highest in Efficiency with a rank of 0.1474, while Design Option Simulation 3 takes the Performance criterion with a rank of 1.4286 out of a possible 2.857. However, Design Option Simulation 2 still peaked highest among the options in the Economic Scale with a rank value of 0.1719 and an overall rank of -0.177. Overall, Design Option Simulation 2 is the most viable option under this set of varying levels of importance.
Table 4-11 Design Criteria using Sensitivity Analysis with Performance criterion with highest level of
importance value
Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        1            14.29%      0.258       0.0369     0.225       0.0321     -4.976      -0.7109
Performance       4            57.14%      -2.0916     -1.1952    -1.6713     -0.955     5.000       2.8571
Economic Scale    2            28.57%      -2.541      -0.726     1.203       0.3437     0.779       0.2226
Overall Rank                               -1.8843                -0.5792                2.3688
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations
Table 4-11 showcases the analysis of the design criteria Efficiency, Performance, and Economic Scale with levels of importance of one (1), four (4), and two (2), respectively. The proponents primarily focused on the Performance criterion to examine each design option simulation's behavior when tested under numerous changes in importance. From the results, Design Option Simulation 1 is the most viable option under the Efficiency criterion with a rank value of 0.0369 out of a possible 1.429, while under the Performance criterion, Design Option Simulation 3 is the ideal option with a rank value of 2.8571. For the Economic Scale criterion, Design Option Simulation 2 has the least monetary expenses with a rank of 0.3437. Overall, Design Option Simulation 3 is the most suitable option under this varying level of importance with an overall rank of 2.3688.

Table 4-12 Design Criteria using Sensitivity Analysis with Economic Scale criterion with highest level of
importance value
Criteria          Level of     % Weight    Option 1    Rank       Option 2    Rank       Option 3    Rank
                  Importance
Efficiency        2            28.57%      0.258       0.0737     0.225       0.0643     -4.976      -1.4217
Performance       1            14.29%      -2.0916     -0.2988    -1.6713     -0.2388    5.000       0.7143
Economic Scale    4            57.14%      -2.541      -1.452     1.203       0.6874     0.779       0.4451
Overall Rank                               -1.7139                0.5129                 -0.2623
*Reference: Otto, K. N. and Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in Engineering Design, volume 3, number 2, pages 87-104. Retrieved from http://www.design.caltech.edu/Research/Publications/90e.pdf on March 11, 2013.
*See Appendix C for Calculations.

Table 4-12 showcases the analysis of the design criteria Efficiency, Performance, and Economic Scale with levels of importance of two (2), one (1), and four (4), respectively. Under this analysis, the proponents primarily prioritized the Economic Scale criterion to determine its behavior while it is examined under sensitive changes of the importance level. Based on the results, Design Option Simulation 3 performed well in terms of the least annualized failure rate with a rank value of 0.7143 under the Performance criterion. Furthermore, as shown in the table, Design Option Simulation 2 overpowered all of the other options under the Efficiency and Economic Scale criteria with ranks of 0.0643 and 0.6874, respectively. Overall, Design Option Simulation 2 is the most ideal and viable option under this varying level of importance with an overall rank of 0.5129.
Pareto Optimization

Pareto Optimization, like Sensitivity Analysis, is frequently utilized to prove that a design is feasible in quantitative terms. This method subjects two parameters from the constraints to a thorough comparative analysis to examine the constraints' different behaviors. Its primary use is to identify the lacking parameters that need further improvement and optimization to achieve the ideal feasible design simulation.

Efficiency and Performance

Table 4-13 Summarized data for Efficiency and Performance


Design Option Efficiency Performance

Design Option1 - X1 1.4266 sec 64.71

Design Option2 - X2 1.4325 sec 56.54

Design Option3 - X3 0.0072 sec  18.82

Table 4-13 compares the mathematical values for Efficiency and Performance from Design Option 1 to Design Option 3.

Decision Space

Constraints:

1.4266X1 + 1.4325X2 + 0.0072X3 ≤ 3
64.71X1 + 56.54X2 + 18.82X3 ≥ 0

Minimum Efficiency Minimum Performance


f1= 0.0072 sec X3                                                                 f2= 18.82 X3
Figure 4-1 Efficiency versus Performance Design Criterion Space 
Figure 4-1 clearly shows that Design Option 3 is the most feasible Design Option Simulation for Efficiency with a response time of 0.0072 seconds. On the Performance criterion, Design Option 3 also gave the least annualized failure rate with a value of 18.82%. Design Option 3 is therefore a viable option that can be implemented as the solution. Furthermore, since Design Option 3 dominates both criteria concerned, the criterion space no longer needs to be determined.

Performance and Economic Criteria

Table 4-14 Summarized Data for Performance and Economic


Design Option Performance Economic

Design Option1 - X1 64.71 37,705.24

Design Option2 - X2 56.54 18,985.24

Design Option3 - X3 18.82 21,105.24

Table 4-14 shows the summary of conducted results from Performance and Economic of the proposed
design option simulations (See Appendix B for the computation of Performance and Economic)

Decision Space

Constraints:

64.71X1 + 56.54X2 + 18.82X3 ≥ 0
37705.24X1 + 18985.24X2 + 21105.24X3 ≤ 50000

Minimum Performance Minimum Economic


f1 = 18.82% X3                                                        f2 = 18,985.24 X2
Figure 4-2 Performance versus Economic Design Criterion Space 
Figure 4-2 clearly shows that Design Option 3 is the most feasible Design Option Simulation for the Performance criterion with an annualized failure rate of 18.82%. On the Economic criterion, Design Option 2 gave the least amount of expenses with an estimated value of 18,985.24 pesos. By making Design Option 2 and Design Option 3 the lead references, the proponents were able to apply the Pareto Optimal Line. The Pareto Optimal Line serves as a guide to further identify whether the other design option simulation is still feasible; hence, by applying the proper trade-off, the proponents can meet the objectives set by their client.

Criterion Space

For the proponents to determine the criterion space, the main objective function should be plotted to serve as the third vertex of the criterion space. To do this, the proponents identify the coordinates of the main objective function together with the condition of whether that function is minimized or maximized.

Objectives

Minimize:                                                                  Minimize:                            

Economic Scale                                                        Performance

f1 ≤ 50000;                                                                      f2 ≤ 0;

Coordinates (x, y)
Figure 4-3 Performance and Economic Scale Criteria Decision Space

Figure 4-3 illustrates that by implementing the coordinates of Design Option 2 and Design Option 3 with the
aid of the main objective functions, criterion space of Economic and Performance can be plotted and
visualized (color). All of the Design options under the plotted area can be considered as feasible. In
contrast, the other design options outside the range of the plotted area can be classified as not a viable part
of the solution for the objective. Since all of the design options can be covered by the plotted space, it
merely implies that all of the design options are considered to be a feasible solution for the objective.

Criterion Space Area:

Criterion Space = (1/2) [x1(y2 − y3) + x2(y3 − y1) + x3(y1 − y2)]
Table 4-16 Coordinates of the Economic Scale and Performance
X1 0 Y1 50000

X2 18.82 Y2 21105.24

X3 56.54 Y3 18985.24

For the proponents to successfully predict which criteria can be manipulated, the area of the criterion space was fully utilized. It implies that the chosen designs can be adjusted to suit the Pareto Optimal Line, making the design options compatible for manipulation while still serving as a feasible solution for the objectives.
Criterion Space = (1/2) [0(21105.24 − 18985.24) + 18.82(18985.24 − 50000) + 56.54(50000 − 21105.24)]
= 525005.9736
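The criterion-space area is one half of the absolute shoelace sum for a triangle, so the computation above can be sketched with a small helper (the vertex coordinates are the ones tabulated for the Economic Scale and Performance pair):

```python
def criterion_space_area(p1, p2, p3):
    """Area of the triangle spanned by three (x, y) criterion-space vertices."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))

# Economic Scale vs Performance vertices (Table 4-16)
area = criterion_space_area((0, 50000), (18.82, 21105.24), (56.54, 18985.24))
print(round(area, 4))   # 525005.9736
```

The same helper reproduces the Economic Scale and Efficiency area computed later in this section.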

Economic Scale and Efficiency Criteria

Table 4-17 Summarized Data for Economic Scale and Efficiency 

Design Option Economic Efficiency

Design Option1 - X1 37,705.24 1.4266 sec

Design Option2 - X2 18,985.24 1.4325 sec

Design Option3 - X3 21,105.24 0.0072 sec

Table 4-17 shows the summary of conducted results from Economic and Efficiency of the proposed design
option simulations (See Appendix B for the computation of Economic and Efficiency)

Decision Space:

Constraints:

37705.24X1 + 18985.24X2 + 21105.24X3 ≤ 50000
1.4266X1 + 1.4325X2 + 0.0072X3 ≤ 3

Minimum Economic                                                            Minimum Efficiency


f1= 18,985.24 X2                                                              f2=0.0072 X3  
Figure 4-4 Efficiency and Economic Criteria Decision Space

Figure 4-4 clearly shows that Design Option 2 is the most feasible Design Option Simulation for the Economic criterion, giving the least amount of expenses with an estimated value of 18,985.24 pesos. On the Efficiency criterion, Design Option 3 gave the best response time with a value of 0.0072 seconds. By making Design Option 2 and Design Option 3 the lead references, the proponents were able to apply the Pareto Optimal Line, which serves as a guide to further identify whether the other design option simulation is still feasible; hence, by applying the proper trade-off, the proponents can meet the objectives set by their client.

Criterion Space:
For the proponents to determine the criterion space, the primary objective function should be plotted to serve as the third vertex of the criterion space. To do this, the proponents identify the coordinates of the main objective function together with the condition of whether that function is minimized or maximized.

Objectives

Minimize:                                                                  Minimize:                            

Economic Scale                                                        Efficiency

f1 ≤ 0;                                                                      f2≤ 0;

Coordinates (x, y)
Figure 4-5 Efficiency and Economic Design Criterion Space with color

Figure 4-5 illustrates that by implementing the coordinates of Design Option 2 and Design Option 3 with the
aid of the main objective functions, criterion space of Economic and Efficiency can be plotted and
visualized (color). All of the Design options under the plotted area can be considered as feasible. In
contrast, the other design options outside the range of the plotted area can be classified as not viable for
the objective. Since all of the design options can be covered by the plotted space, it merely implies that all
design options are considered a feasible solution for the objective.

Criterion Space Area:

Criterion Space = (1/2) [x1(y2 − y3) + x2(y3 − y1) + x3(y1 − y2)]

Table 4-18 Coordinates of the Economic Scale and Efficiency


X1 0.0072 Y1 21,105.24

X2 1.4325 Y2 18,985.24

X3 3 Y3 50,000
Criterion Space = (1/2) [0.0072(18985.24 − 50000) + 1.4325(50000 − 21105.24) + 3(21105.24 − 18985.24)]
= 23764.21871

Likewise, for the proponents to successfully predict which criteria can be manipulated, the area of this criterion space was fully utilized. It implies that the chosen designs can be adjusted to suit the Pareto Optimal Line, making the design options compatible for manipulation while still serving as a feasible solution for the objectives.

Table 4-19 Summary of Pareto Optimization of the Design Option


Criteria Compared            Design Option Simulation 1   Design Option Simulation 2   Design Option Simulation 3

Efficiency vs Performance 0 0 2

Performance vs Economic 0 1 1

Economic vs Efficiency 0 1 1

Total 0 2 4

Table 4-19 shows the tabulated results of the Pareto Optimization using a point system based on the rules of criteria satisfaction. For a Design Option Simulation to earn two (2) points, it must win both of the criteria compared; to earn one (1) point, it must win only a single criterion compared; and lastly, a half-point (0.5) is awarded if the Design Option Simulation falls on the Pareto Optimal Line.
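The point system above can be tallied with a short sketch. The winners per comparison are read from Figures 4-1, 4-2, and 4-4; since no option fell on the Pareto Optimal Line in these comparisons, the half-point case does not arise here:

```python
from collections import Counter

# Winner of each criterion in each pairwise comparison (from Figures 4-1, 4-2, 4-4).
# Listing an option twice for a comparison means it won both criteria (2 points).
comparisons = {
    "Efficiency vs Performance": ["Option 3", "Option 3"],
    "Performance vs Economic":   ["Option 3", "Option 2"],
    "Economic vs Efficiency":    ["Option 2", "Option 3"],
}

points = Counter()
for winners in comparisons.values():
    points.update(winners)      # 1 point per criterion won

print(dict(points))   # {'Option 3': 4, 'Option 2': 2}
```

The totals match Table 4-19: Option 1 scores 0, Option 2 scores 2, and Option 3 scores 4.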

The Design Option Simulations, presented with quantitative data, give the proponents a better chance of determining the most ideal and feasible solution for the client's problem. With the aid of Sensitivity Analysis and Pareto Analysis, the proponents were able to identify the most feasible Design Option Simulation that achieves all of the objectives set in the research, and to develop a recommendation to be submitted to the client while strictly abiding by the client requirements through a proper trade-off. Sensitivity Analysis determines the best viable solution with respect to the Design Option Simulations' varying importance levels. Based on the results, Design Option Simulation 3 is a strong candidate as a solution when the level of importance varies, making it not too sensitive to the client's requirements. The data gathered from the Pareto Analysis likewise determine that Design Option Simulation 3 is the most viable solution for the research objectives; any divergence from the Sensitivity Analysis results is resolved by considering the requirements set by the client. With the client requirements primarily prioritizing the Design Option Simulation's Efficiency, the proponents concurred that Design Option Simulation 3 is the most feasible solution for the research objectives. Moreover, based on the analyzed and gathered data, the proponents highly suggest Design Option Simulation 3 as it has the most satisfactory performance when the level of importance varies, which is beneficial especially for the client requirements that need to be achieved. Design Option Simulation 3 illustrated a versatile performance under the Sensitivity Analysis, remaining a viable option in four (4) of the tested variations of the level of importance.

Design Standards

IEEE: 2700-2014: Standard for Sensor Performance Parameters Definitions

This standard specifies a common framework for sensor performance specification terminology, units, conditions, and limits. It also fulfills the need for a common methodology for specifying sensor performance that eases non-scalable integration challenges, and defines a minimum set of performance parameters, with required units, conditions, and distributions for each sensor.

ISO/IEC 25010:2011(n): Systems and software engineering — Systems and software Quality
Requirements and Evaluation (SQuaRE) — System and software quality models

This standard mainly involves the processes associated with requirements definition, verification, and
validation, focusing on the specification and evaluation of quality requirements. It describes how the quality
models can be used for software quality requirements and its purpose in the software quality evaluation
process. This International Standard can also be used in conjunction with ISO/IEC 15504 to provide a
framework for software product quality definition in the customer-supplier process and support for review,
verification, validation, and a framework for quantitative quality evaluation, in the support process.

IEEE 1633-2016 – IEEE Recommended Practice on Software Reliability 

The standard covers the methods for assessing and predicting the reliability of the software, based on a
life-cycle approach to software reliability engineering (SRE), are prescribed in this recommended practice.
It provides information necessary for the application of software reliability (SR) measurement to a project,
lays a foundation for building consistent methods, and establishes the basic principle for collecting the data
needed to assess and predict the reliability of software. The recommended practice prescribes how any
user can participate in SR assessments and predictions.

ISO 18251-1:2017: Non-destructive testing — Infrared thermography — Part 1: Characteristics of system and equipment

This International Standard emphasizes the effectiveness of any application of infrared thermography in the
field of industrial NDT. This document describes the main components and their characteristics, constituting
an infrared (IR) imaging system and related equipment used in non-destructive testing (NDT). It also aims
to assist the user in selecting an appropriate system for a particular measurement task.

ISO/TS 16829:2017: Non-destructive testing — Automated Ultrasonic Testing


This International Standard covers all kinds of ultrasonic testing on components or complete manufactured structures for correctness of geometry, material properties such as quality or defects, and fabrication methodology. This standard correlates with Design Option 2, which uses ultrasonic sensors to detect blockage.

ISO/IEC 7942-1: Information Technology – Computer Graphics and Image Processing – Graphical Kernel
System (GKS) 

This standard is also applied in the design, and it specifies a set of functions for computer graphics
programming, which is the Graphical Kernel System (GKS). This also provides functions for two-
dimensional graphical output, the storage and dynamic modification of pictures, and operator input.
 Chapter 5 – Final Design

For the maintenance monitoring and alarm system of the bucket elevator, the proponents considered three design option simulations. The first design option simulation maximizes the use of two sets of infrared (IR) sensors: the first set measures the rotation speed of the drum pulley with the aid of an indicator, while the second set is incorporated at the boot to detect whether the corn grains have reached the height limit and to avoid clogging issues. The use of multiple, equally spaced sensors is essential to guard against false positives, in which only a few sensors detect that the height limit has been reached on a specific side, which may or may not affect the belt operations. The IR sensors operate by emitting an infrared beam toward a specified location; a photodiode then captures the reflected beam. The intensity of the IR light captured by the photodiode represents the data gathered between the target and the sensor.

The second Design Option Simulation utilizes an ultrasonic sensor and an embedded tachometer to monitor the height of the grain and the speed of the bucket elevator. The ultrasonic sensor operates by emitting a sound wave toward the target at a frequency above the range of human hearing; the transducer acts as both emitter and microphone for the ultrasonic sound, after which the sensor determines the displacement of the target by measuring the time gap between the emitted and reflected ultrasonic pulses. On the other hand, the embedded tachometer senses the motor's speed without contact, using the principle of light transmission and reflection to generate a signal. The signal, which is converted into an electric signal, is fed into the microcontroller that is programmed to calculate the speed of the bucket elevator in revolutions per minute. Once the clogging reaches the height limit, the unclogging mechanism will help clear the corn grains that might delay the client's production time.
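The ultrasonic ranging principle described above amounts to halving the round-trip time of the pulse and multiplying by the speed of sound. A minimal sketch, where the 343 m/s figure assumes air at roughly 20 °C and is not taken from the design document:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed)

def distance_m(echo_time_s):
    """Target distance from the round-trip echo time of an ultrasonic pulse.

    The pulse travels to the target and back, so only half the elapsed
    time corresponds to the one-way distance.
    """
    return SPEED_OF_SOUND * echo_time_s / 2.0

print(distance_m(0.01))   # a 10 ms echo corresponds to about 1.715 m
```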

The last Design Option Simulation uses image processing. With the aid of the videos and images simulated in SolidWorks, the proponents were able to analyze the results and begin the video and image processing. The video processing measures the speed, while the image processing determines the corn grain level. Once the grain level reaches the critical height limit, the unclogging mechanism automatically activates and starts the clearing process to prevent delays in the client's production.

Through Sensitivity Analysis and Pareto Optimization on the three designs, the proponents conclude that Design 3 is the best suited for implementation. The design utilizes image processing, mainly consists of simulations from SolidWorks, and meets the design criteria of efficiency, performance, and economic scale.
Winning Design

Figure 5-1 Design Option 3 using Image Processing Front Panel as a Winning Design Option

Figure 5-2 Design Option 3 using Image Processing Block Diagram  

Components 

Figure 5-3 Switched-Mode Power Supply 24V, 12V, 5V (150W)

Figure 5-3 shows the power supply that converts the voltage and current characteristics; a switched-mode power supply is an electronic power supply that incorporates a switching regulator to convert electrical power efficiently. It is also used as a replacement for linear regulators when higher efficiency, smaller size, or lighter weight is needed.

Figure 5-4 Coaxial Cable

Figure 5-4 shows a Coaxial cable that serves as the transmission line to carry high-frequency electrical
signals with low losses from the camera to the control unit.
Figure 5-5 Resistor

Figure 5-5 shows a Resistor that is used to reduce current flow, adjust signal levels, divide voltages, bias
active elements, and terminate transmission lines to the microcontroller unit interface in preparation for the
motor driver.

Figure 5-6 Capacitor

Figure 5-6 shows a capacitor that is used in microcontroller unit interfaces for power supply smoothing,
timing by supplying power to it through a resistor and filtering by passing DC voltage to it.
Figure 5-7 Crystal Unit

Figure 5-7 shows a crystal unit that is used as a mechanical resonator since it is made up of high stability
piezoelectric quartz. It also generates clock signals which are very essential for the design to operate and
achieve high stability.

Figure 5-8 Microcontroller Unit (Raw Chip)

Figure 5-8 shows an Arduino Uno (raw chip) that interprets the videos and images captured by the camera and processed by the LabVIEW software. It also controls the indicators of the design.
Figure 5-9 Transistor-Transistor Logic Converter

Figure 5-9 shows a transistor-transistor logic (TTL) converter. TTL circuits are characterized by high switching speed and relative immunity to noise, and they employ transistors with multiple emitters in gates having more than one input.

Figure 5-10 CCTV or Webcam

Figure 5-10 shows a CCTV or Webcam that will be efficient especially on Design Simulation 3 which is
focused on image processing. It will monitor the flow of corn grains to determine the clogging and
unclogging situation.
Figure 5-11 High Torque Motor with Gearbox Assembly

Figure 5-11 shows a High Torque Motor with Gearbox Assembly that consists of an electric DC motor and a gearbox. The gearbox reduces the DC motor speed while increasing the DC motor torque; therefore, the user can get lower speed and higher torque from the gear motor.
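The speed/torque trade of the gearbox can be illustrated with a short sketch under ideal (lossless) assumptions; the 30:1 ratio and the motor figures below are hypothetical, not values from the design:

```python
def gearbox_output(motor_rpm, motor_torque_nm, ratio, efficiency=1.0):
    """Ideal gearbox: output speed drops by the gear ratio, torque rises
    by the same factor (scaled by mechanical efficiency)."""
    return motor_rpm / ratio, motor_torque_nm * ratio * efficiency

# hypothetical DC motor: 3000 rpm, 0.5 N*m, through a 30:1 gearbox
rpm, torque = gearbox_output(3000, 0.5, ratio=30)
print(rpm, torque)   # 100.0 rpm, 15.0 N*m
```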

Figure 5-12 Motor Driver

Figure 5-12 shows a Motor driver that is used to control the high torque motor by taking a low-current
control signal and then turning it into a higher-current signal. 
Figure 5-13 Aluminum Sheet (5mmx1000mmx1000mm)

Figure 5-13 shows an Aluminum sheet that will be attached at the end of the high torque motor which will
act as the pusher to boot section.

Figure 5-14 Aluminum Sheet (3mmx1000mmx1000mm)

Figure 5-14 shows an Aluminum sheet with the given dimensions that will be used as casing of the control
unit.
Figure 5-15 5m USB A to B Cable

For the data logging, Figure 5-15 shows a USB A to B cable that will be used to connect the control unit to
the computer network. 

Testing

This part contains the set of experimentations done using Design Option Simulation 3. Methods of
Procedure from the Experiment will be shown accompanied by the tabular representations of every
experiment's result. This chapter will conclude with a summary of the recorded results and a final verdict on
the Design Option Simulation's efficiency.

Methods of Procedure

1. Open the LabView file of the Design Option Simulation.


2. Once opened, run the simulation.
3. Wait for the simulation to run at a stable speed of 83 m/min.
4. When the speed reaches 83 m/min, observe the filling of the grains from the front panel.
5. Once the grain reaches the 80%-90% level of the boot, boot clearing will start.
6. Check the Excel file log for the status of the bucket elevator.

Simulation Process

Design Option Simulation 3 works by maximizing the use of two simulated videos from SolidWorks with the aid of NI LabVIEW while utilizing image processing. NI LabVIEW is responsible for processing the frames of both the bucket elevator and its boot; once processed, each frame is converted into grayscale and then into a binary image. After all of the essential data are gathered, they are transferred to a series of loops for the final output display on the front panel and the status logging in the Excel file.
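The frame-processing chain described above (frame to grayscale to binary image to a level estimate) can be sketched as follows. This is a minimal stand-in, not the LabVIEW implementation: the luminance weights and the threshold are common defaults, assumed for illustration:

```python
def boot_fill_percent(frame_rgb, threshold=128):
    """Estimate how full the boot is from the fraction of bright pixels."""
    # grayscale conversion using standard luminance weights (assumed)
    gray = [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in frame_rgb]
    # binarize the grayscale frame and count "grain" pixels
    bright = sum(px >= threshold for row in gray for px in row)
    total = sum(len(row) for row in gray)
    return 100.0 * bright / total

# toy 10x10 frame: top half dark (empty), bottom half bright (grain)
frame = ([[(0, 0, 0)] * 10 for _ in range(5)]
         + [[(255, 255, 255)] * 10 for _ in range(5)])
print(boot_fill_percent(frame))   # 50.0
```

In the simulation, a value in the 80%-90% range would trigger the boot-clearing step described in the procedure.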
Table 5-1 Performance of the Device

Trial  Efficiency
1      Success
2      Success
3      Success
4      Success
5      Success
6      Success
7      Success
8      Success
9      Success
10     Success
11     Success
12     Success
13     Success
14     Success
15     Success
16     Success
17     Success
18     Success
19     Success
20     Success
21     Success
22     Success
23     Success
24     Success

Performance: 100%
Computation for the performance of the device based on efficiency:

Performance = (S / T) × 100

where:
S = number of successful trials
T = total number of trials

Reference: ISO/IEC 25010:2011, Systems and software engineering - Systems and software Quality
Requirements and Evaluation (SQuaRE) - System and software quality models.
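As a quick check, the formula can be evaluated in a few lines of Python (a sketch for verification only, not part of the LabVIEW system):

```python
def performance(successful_trials, total_trials):
    """Performance = (S / T) x 100, per the formula above."""
    return successful_trials / total_trials * 100

# 24 successful trials out of 24, as recorded in Table 5-1.
print(performance(24, 24))  # 100.0
```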
Interpretation of Data

From the trials conducted to test the efficiency of the device, it is evident that the device was 100%
efficient. As the table above shows, every one of the 24 trials succeeded, which results in a 100%
performance rating; Design Simulation 3 proved its effectiveness throughout the trials performed. Given
the same randomizer used in all the simulations, only Design Simulation 3 showed zero failures during its
run-time performance.

Summary of Findings

This research effort produced recommended design simulations for monitoring bucket elevator
operations. The primary objective of this research was to provide an efficient simulation-based bucket
elevator monitoring system. From the data gathered, the proponents highly recommend Design Option 3
for its efficiency, a success compared with the other two design simulations across the trials conducted.
On the Economic criterion, Design Option 2 gave the lowest expenses, with an estimated value of
18,985.24 pesos, and can also be considered because of its low-cost components.

Conclusion

Designing a system to monitor and detect the bucket elevator speed and the build-up of corn grains at the
boot addresses the client's challenges. Design Option 1 uses an infrared (IR) tachometer and infrared (IR)
sensors, while Design Option 2 uses an embedded (contactless) tachometer and ultrasonic sensors.
Lastly, Design Option 3 applies image processing. The best design option was determined by how well
each met the criteria and their constraints. First is efficiency: Design Option 3 outperformed the other two
design options with an average response time of 0.0072 seconds. Second, the design's annualized failure
rate is 18.82%, the lowest among the three design options. Lastly, its projected cost of 21,105.24 pesos
falls within the budget limitation.

After various tests and analyses of the three design options, it was concluded that Design Option 3 is the
best system design. It consists of a CCTV camera or webcam and applies image processing for both
monitoring and detection. An additional advantage of this design is a real-time feed of the inside of the
bucket elevator. Through this testing and analysis, Design Option 3 was observed to detect and monitor
the bucket elevator efficiently, economically, and reliably.
Recommendation

The project has a promising outcome that opens possible future innovations of the designs. Based on the
data, results, and conclusions of this design project, the proponents recommend the following
improvements: enhance the simulation to achieve a more accurate replica of a bucket elevator's
operation; improve the image quality or use a higher-resolution camera for the image processing;
redesign the unclogging system; and, lastly, extend the simulation so it can detect more specific issues in
bucket elevators, such as the product intake of the machine.
References

Deroy, M. C., Espaldon, A. J., Osa, J. E., Macalla, G. P., & Pascua, D. A. (2017). Design and
Implementation of an Automated Fish Feeder Robot for the Philippine Aquaculture Industry: Feeding
Mechanism and Float Design Module. Mindanao Journal of Science and Technology, 15, 89-102.

Fedorko, G., Molnar, V., Marasova, D., Grincova, A., Dovica, M., Zivcak, J., . . . Husakova, N. (2013,
September). Failure analysis of belt conveyor damage caused by the falling material. Engineering Failure
Analysis, 36, 30-38.

Ilanković, N., Živanić, D., Zelić, A., & Savić, S. (2019). Case Study – Safety Measures Applied On Bucket
Elevators. Politehnika.

Jaiswal, A. (2015). Design And Model Of Bucket Elevator. Haryana.

Oo, M. T., & Win, M. M. (2019). Analysis of Belt Bucket Elevator. International Journal of Trend in Scientific
Research and Development (Vol.3, No. 5), 760-763.

Pang, Y. (2010). Intelligent Belt Conveyor Monitoring and Control. Doctoral Thesis, Marine & Transport
Technology.

Patel, S., Patel, S., & Patel, J. (2012). A Review on Design and Analysis of Bucket Elevator. International
Journal of Engineering Research and Applications (IJERA), 2(5), 18-22.

Shinde, S. T., Dixit, V. S., Nukulwar, M. R., & Pimpale, S. S. (2016, April 6). Material optimization and
Modal Analysis of Elevator bucket. International Journal of Current Engineering and Technology, 6(2).
Retrieved from https://inpressco.com/wp-content/uploads/2016/04/Paper33574-580.pdf

Wheat, J. (2007). Hazard Monitoring Equipment. 4B Components Ltd.

Aypa, S.M., (1995). Aquaculture in the Philippines. In: Bagarinao T.U., Flores, E.E.C. (Eds.), Towards
Sustainable Aquaculture in Southeast Asia and Japan: Proceedings of the Seminar-Workshop on
Aquaculture Development in Southeast Asia, Iloilo City, Philippines, 137-147.

Ayub, M. A., Kushairi, S., and Latif, A.A., (2015). A new mobile robotic system for intensive aquaculture
industries., Journal of Applied Science and Agriculture., 10(8), 1-7.
High Availability and Disaster Recovery (2006)
Otto, K. N., & Antonsson, E. K. (1991). Trade-off strategies in engineering design. Research in
Engineering Design, 3(2), 87-104. Retrieved March 11, 2013, from
http://www.design.caltech.edu/Research/Publications/90e.pdf

https://www.baslerweb.com/en/vision-campus/vision-systems-and-components/image-processing-systems/
https://www.mathscinotes.com/2016/04/mtbf-failure-rate-and-annualized-failure-rate-again/
https://www.bb-elec.com/Learning-Center/All-White-Papers/Fiber/MTBF,-MTTR,-MTTF,-FIT-Explanation-of-Terms/MTBF-MTTR-MTTF-FIT-10262012-pdf.pdf
https://www.zdnet.com/article/mtbf-afr-and-you/
https://www.iso.org/obp/ui/#iso:std:iso-iec:25010:ed-1:v1:en
https://standards.ieee.org/standard/2700-2014.html
https://www.iso.org/standard/61882.html
https://www.iso.org/standard/72908.html
https://www.iso.org/standard/14915.html
https://www.iso.org/obp/ui/#iso:std:iso:18251:-1:ed-1:v1:en
https://standards.ieee.org/standard/1633-2016.html
APPENDICES

Appendix A. Images of Simulated Bucket Elevator in SolidWorks (Scale used in Design Option 3)

Figure Appendix-1 Simulated Bucket Elevator


In the simulated design of the bucket elevator, the red bucket acts as an indicator for the video processing.
Appendix B. Algorithm Testing

Design Option 1
Response Time
Response Time = Time Processed − Time Detected

Trial Average Response Time

1 1.5000 secs

2 1.3928 secs

3 1.3226 secs

4 1.4483 secs

5 1.4828 secs

6 1.2903 secs

7 1.3571 secs

8 1.3448 secs

9 1.9000 secs

10 1.3571 secs

11 1.2857 secs

12 1.4643 secs
13 1.6552 secs

14 1.5357 secs

15 1.4828 secs

16 1.5000 secs

17 1.2593 secs

18 1.2500 secs

19 1.2414 secs

20 1.5000 secs

21 1.5000 secs

22 1.5000 secs

23 1.3103 secs

24 1.3571 secs

AVE. 1.4266 secs

Reliability

Failures in Time (FIT) = Failed Attempts / Total Operating Time

FIT = 5 / 24 = 0.2083

MTBF = 1 / FIT = 1 / 0.2083

MTBF = 4.8

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

AFR (%) = 64.71%
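The FIT, MTBF, and AFR arithmetic above can be scripted for quick re-checks. A minimal Python sketch follows; the life-expectancy value of 5 is an assumption inferred from the figures, since this section does not state it explicitly:

```python
import math

def reliability_metrics(failed, total_operating_time, life_expectancy):
    """FIT, MTBF, and AFR as defined above.

    life_expectancy is an assumption here (5 reproduces the 64.71% figure);
    the document does not state it explicitly in this section.
    """
    fit = failed / total_operating_time          # Failures in Time
    mtbf = 1 / fit                               # Mean Time Between Failures
    afr = (1 - math.exp(-life_expectancy / mtbf)) * 100  # Annualized Failure Rate, %
    return fit, mtbf, afr

# Design Option 1: 5 failed attempts over 24 trials.
fit, mtbf, afr = reliability_metrics(failed=5, total_operating_time=24, life_expectancy=5)
print(round(fit, 4), round(mtbf, 1), round(afr, 2))  # 0.2083 4.8 64.71
```

The same function reproduces the Design Option 2 and 3 figures when given 4 and 1 failed attempts, respectively.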

Design Option 2
Response Time = Time Processed − Time Detected

Trial Average Response Time

1 1.4299 secs

2 1.4251 secs

3 1.4245 secs

4 1.4663 secs

5 1.3905 secs

6 1.3962 secs

7 1.4178 secs

8 1.3972 secs
9 1.4299 secs

10 1.4626 secs

11 1.4211 secs

12 1.4360 secs

13 1.4279 secs

14 1.4764 secs

15 1.4725 secs

16 1.4651 secs

17 1.4306 secs

18 1.4252 secs

19 1.4138 secs

20 1.4186 secs

21 1.4028 secs

22 1.4245 secs

23 1.4198 secs

24 1.5046 secs

AVE. 1.4325 secs


Reliability

Failures in Time (FIT) = Failed Attempts / Total Operating Time

FIT = 4 / 24 = 0.1667

MTBF = 1 / FIT = 1 / 0.1667

MTBF = 6

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

AFR (%) = 56.54%

Design Option 3
Response Time = Time Processed − Time Detected

Trial Average Response Time

1 0.0051 secs

2 0.0060 secs

3 0.0060 secs

4 0.0090 secs

5 0.0050 secs
6 0.0060 secs

7 0.0060 secs

8 0.0070 secs

9 0.0058 secs

10 0.0060 secs

11 0.0070 secs

12 0.0060 secs

13 0.0060 secs

14 0.0010 secs

15 0.0080 secs

16 0.0090 secs

17 0.0080 secs

18 0.0080 secs

19 0.0070 secs

20 0.0090 secs

21 0.0000 secs

22 0.0090 secs
23 0.0070 secs

24 0.0080 secs

AVE. 0.0072 secs

Reliability

Failures in Time (FIT) = Failed Attempts / Total Operating Time

FIT = 1 / 24 = 0.0417

MTBF = 1 / FIT = 1 / 0.0417

MTBF = 23.98

AFR (%) = (1 − e^(−Life Expectancy / MTBF)) × 100

AFR (%) = 18.82%
Appendix C. Sensitivity Analysis

The Design Ranking Value (DRV) of each design option under each criterion is the same in every
weighting scenario, so it is computed once below. Tables 4.1 through 4.12 then apply their respective
criterion weights to these values. In every table, Ranking = (Criterion's Importance / Total Importance) × DRV.

Design Ranking Values

Efficiency (average response time, 3-second ceiling):
Design 1: DRV = {[(3 − 1.4226)/3] × 10} − 5 = 0.258
Design 2: DRV = {[(3 − 1.4325)/3] × 10} − 5 = 0.225
Design 3: DRV = {1 − [(3 − 0.0072)/3]} × 10 − 5 = −4.976

Performance (annualized failure rate, %):
Design 1: DRV = {1 − [(64.71 − 18.82)/64.71]} × 10 − 5 = −2.0916
Design 2: DRV = {1 − [(56.54 − 18.82)/56.54]} × 10 − 5 = −1.6713
Design 3: DRV = {1 − [(18.82 − 18.82)/18.82]} × 10 − 5 = 5

Economic Scale (50,000-peso budget ceiling):
Design 1: DRV = 5 − {[1 − (50,000 − 37,705.24)/50,000] × 10} = −2.541
Design 2: DRV = 5 − {[1 − (50,000 − 18,985.24)/50,000] × 10} = 1.203
Design 3: DRV = 5 − {[1 − (50,000 − 21,105.24)/50,000] × 10} = 0.779

Table 4.1 Design Criteria using Sensitivity Analysis

% Weight: Efficiency = 5/12 × 100 = 41.67%; Performance = 4/12 × 100 = 33.33%; Economic Scale = 3/12 × 100 = 25.00%

Efficiency (Criterion's Importance = 5)
Design 1: Ranking = (5/12) × 0.258 = 0.1075
Design 2: Ranking = (5/12) × 0.225 = 0.0938
Design 3: Ranking = (5/12) × (−4.976) = −2.0733

Performance (Criterion's Importance = 4)
Design 1: Ranking = (4/12) × (−2.0916) = −0.6972
Design 2: Ranking = (4/12) × (−1.6713) = −0.5571
Design 3: Ranking = (4/12) × 5 = 1.6667

Economic Scale (Criterion's Importance = 3)
Design 1: Ranking = (3/12) × (−2.541) = −0.6353
Design 2: Ranking = (3/12) × 1.203 = 0.3008
Design 3: Ranking = (3/12) × 0.779 = 0.1948

Table 4.2 Design Criteria using Sensitivity Analysis with Efficiency criterion having the highest level of
importance value

% Weight: Efficiency = 5/9 × 100 = 55.56%; Performance = 2/9 × 100 = 22.22%; Economic Scale = 2/9 × 100 = 22.22%

Efficiency (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × 0.258 = 0.1433
Design 2: Ranking = (5/9) × 0.225 = 0.125
Design 3: Ranking = (5/9) × (−4.976) = −2.7644

Performance (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × (−2.0916) = −0.4488
Design 2: Ranking = (2/9) × (−1.6713) = −0.3714
Design 3: Ranking = (2/9) × 5 = 1.1111

Economic Scale (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × (−2.541) = −0.5647
Design 2: Ranking = (2/9) × 1.203 = 0.2673
Design 3: Ranking = (2/9) × 0.779 = 0.1731

Table 4.3 Design Criteria using Sensitivity Analysis with Performance criterion having the highest level of
importance value

% Weight: Efficiency = 2/9 × 100 = 22.22%; Performance = 5/9 × 100 = 55.56%; Economic Scale = 2/9 × 100 = 22.22%

Efficiency (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × 0.258 = 0.0573
Design 2: Ranking = (2/9) × 0.225 = 0.05
Design 3: Ranking = (2/9) × (−4.976) = −1.1058

Performance (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × (−2.0916) = −1.162
Design 2: Ranking = (5/9) × (−1.6713) = −0.9285
Design 3: Ranking = (5/9) × 5 = 2.7778

Economic Scale (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × (−2.541) = −0.5647
Design 2: Ranking = (2/9) × 1.203 = 0.2673
Design 3: Ranking = (2/9) × 0.779 = 0.1731

Table 4.4 Design Criteria using Sensitivity Analysis with Economic Scale criterion having the highest level
of importance value

% Weight: Efficiency = 2/9 × 100 = 22.22%; Performance = 2/9 × 100 = 22.22%; Economic Scale = 5/9 × 100 = 55.56%

Efficiency (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × 0.258 = 0.0573
Design 2: Ranking = (2/9) × 0.225 = 0.05
Design 3: Ranking = (2/9) × (−4.976) = −1.1058

Performance (Criterion's Importance = 2)
Design 1: Ranking = (2/9) × (−2.0916) = −0.4488
Design 2: Ranking = (2/9) × (−1.6713) = −0.3714
Design 3: Ranking = (2/9) × 5 = 1.1111

Economic Scale (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × (−2.541) = −1.4117
Design 2: Ranking = (5/9) × 1.203 = 0.6683
Design 3: Ranking = (5/9) × 0.779 = 0.4328

Table 4.5 Design Criteria using Sensitivity Analysis with all criteria having the highest level of importance
value

% Weight: Efficiency = 5/15 × 100 = 33.33%; Performance = 5/15 × 100 = 33.33%; Economic Scale = 5/15 × 100 = 33.33%

Efficiency (Criterion's Importance = 5)
Design 1: Ranking = (5/15) × 0.258 = 0.086
Design 2: Ranking = (5/15) × 0.225 = 0.075
Design 3: Ranking = (5/15) × (−4.976) = −1.6587

Performance (Criterion's Importance = 5)
Design 1: Ranking = (5/15) × (−2.0916) = −0.6972
Design 2: Ranking = (5/15) × (−1.6713) = −0.5571
Design 3: Ranking = (5/15) × 5 = 1.6667

Economic Scale (Criterion's Importance = 5)
Design 1: Ranking = (5/15) × (−2.541) = −0.847
Design 2: Ranking = (5/15) × 1.203 = 0.401
Design 3: Ranking = (5/15) × 0.779 = 0.2597

Table 4.6 Design Criteria using Sensitivity Analysis with Efficiency criterion having the highest level of
importance value

% Weight: Efficiency = 5/9 × 100 = 55.56%; Performance = 3/9 × 100 = 33.33%; Economic Scale = 1/9 × 100 = 11.11%

Efficiency (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × 0.258 = 0.1433
Design 2: Ranking = (5/9) × 0.225 = 0.125
Design 3: Ranking = (5/9) × (−4.976) = −2.7644

Performance (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × (−2.0916) = −0.6972
Design 2: Ranking = (3/9) × (−1.6713) = −0.5571
Design 3: Ranking = (3/9) × 5 = 1.6667

Economic Scale (Criterion's Importance = 1)
Design 1: Ranking = (1/9) × (−2.541) = −0.2823
Design 2: Ranking = (1/9) × 1.203 = 0.1337
Design 3: Ranking = (1/9) × 0.779 = 0.0866

Table 4.7 Design Criteria using Sensitivity Analysis with Performance criterion having the highest level of
importance value

% Weight: Efficiency = 1/9 × 100 = 11.11%; Performance = 5/9 × 100 = 55.56%; Economic Scale = 3/9 × 100 = 33.33%

Efficiency (Criterion's Importance = 1)
Design 1: Ranking = (1/9) × 0.258 = 0.0287
Design 2: Ranking = (1/9) × 0.225 = 0.025
Design 3: Ranking = (1/9) × (−4.976) = −0.5529

Performance (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × (−2.0916) = −1.162
Design 2: Ranking = (5/9) × (−1.6713) = −0.9285
Design 3: Ranking = (5/9) × 5 = 2.7778

Economic Scale (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × (−2.541) = −0.847
Design 2: Ranking = (3/9) × 1.203 = 0.401
Design 3: Ranking = (3/9) × 0.779 = 0.2597

Table 4.8 Design Criteria using Sensitivity Analysis with Economic Scale criterion having the highest level
of importance value

% Weight: Efficiency = 3/9 × 100 = 33.33%; Performance = 1/9 × 100 = 11.11%; Economic Scale = 5/9 × 100 = 55.56%

Efficiency (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × 0.258 = 0.086
Design 2: Ranking = (3/9) × 0.225 = 0.075
Design 3: Ranking = (3/9) × (−4.976) = −1.6587

Performance (Criterion's Importance = 1)
Design 1: Ranking = (1/9) × (−2.0916) = −0.2324
Design 2: Ranking = (1/9) × (−1.6713) = −0.1857
Design 3: Ranking = (1/9) × 5 = 0.5556

Economic Scale (Criterion's Importance = 5)
Design 1: Ranking = (5/9) × (−2.541) = −1.4117
Design 2: Ranking = (5/9) × 1.203 = 0.6683
Design 3: Ranking = (5/9) × 0.779 = 0.4328

Table 4.9 Design Criteria using Sensitivity Analysis with all criteria having an equal importance value

% Weight: Efficiency = 3/9 × 100 = 33.33%; Performance = 3/9 × 100 = 33.33%; Economic Scale = 3/9 × 100 = 33.33%

Efficiency (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × 0.258 = 0.086
Design 2: Ranking = (3/9) × 0.225 = 0.075
Design 3: Ranking = (3/9) × (−4.976) = −1.6587

Performance (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × (−2.0916) = −0.6972
Design 2: Ranking = (3/9) × (−1.6713) = −0.5571
Design 3: Ranking = (3/9) × 5 = 1.6667

Economic Scale (Criterion's Importance = 3)
Design 1: Ranking = (3/9) × (−2.541) = −0.847
Design 2: Ranking = (3/9) × 1.203 = 0.401
Design 3: Ranking = (3/9) × 0.779 = 0.2597

Table 4.10 Design Criteria using Sensitivity Analysis with Efficiency criterion having the highest level of
importance value

% Weight: Efficiency = 4/7 × 100 = 57.14%; Performance = 2/7 × 100 = 28.57%; Economic Scale = 1/7 × 100 = 14.29%

Efficiency (Criterion's Importance = 4)
Design 1: Ranking = (4/7) × 0.258 = 0.1474
Design 2: Ranking = (4/7) × 0.225 = 0.1286
Design 3: Ranking = (4/7) × (−4.976) = −2.8434

Performance (Criterion's Importance = 2)
Design 1: Ranking = (2/7) × (−2.0916) = −0.5976
Design 2: Ranking = (2/7) × (−1.6713) = −0.4775
Design 3: Ranking = (2/7) × 5 = 1.4286

Economic Scale (Criterion's Importance = 1)
Design 1: Ranking = (1/7) × (−2.541) = −0.368
Design 2: Ranking = (1/7) × 1.203 = 0.1719
Design 3: Ranking = (1/7) × 0.779 = 0.1113

Table 4.11 Design Criteria using Sensitivity Analysis with Performance criterion having the highest level of
importance value

% Weight: Efficiency = 1/7 × 100 = 14.29%; Performance = 4/7 × 100 = 57.14%; Economic Scale = 2/7 × 100 = 28.57%

Efficiency (Criterion's Importance = 1)
Design 1: Ranking = (1/7) × 0.258 = 0.0369
Design 2: Ranking = (1/7) × 0.225 = 0.0321
Design 3: Ranking = (1/7) × (−4.976) = −0.7109

Performance (Criterion's Importance = 4)
Design 1: Ranking = (4/7) × (−2.0916) = −1.1952
Design 2: Ranking = (4/7) × (−1.6713) = −0.955
Design 3: Ranking = (4/7) × 5 = 2.8571

Economic Scale (Criterion's Importance = 2)
Design 1: Ranking = (2/7) × (−2.541) = −0.726
Design 2: Ranking = (2/7) × 1.203 = 0.3437
Design 3: Ranking = (2/7) × 0.779 = 0.2226

Table 4.12 Design Criteria using Sensitivity Analysis with Economic Scale criterion having the highest
level of importance value

% Weight: Efficiency = 2/7 × 100 = 28.57%; Performance = 1/7 × 100 = 14.29%; Economic Scale = 4/7 × 100 = 57.14%

Efficiency (Criterion's Importance = 2)
Design 1: Ranking = (2/7) × 0.258 = 0.0737
Design 2: Ranking = (2/7) × 0.225 = 0.0643
Design 3: Ranking = (2/7) × (−4.976) = −1.4217

Performance (Criterion's Importance = 1)
Design 1: Ranking = (1/7) × (−2.0916) = −0.2988
Design 2: Ranking = (1/7) × (−1.6713) = −0.2388
Design 3: Ranking = (1/7) × 5 = 0.7143

Economic Scale (Criterion's Importance = 4)
Design 1: Ranking = (4/7) × (−2.541) = −1.452
Design 2: Ranking = (4/7) × 1.203 = 0.6874
Design 3: Ranking = (4/7) × 0.779 = 0.4451

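All of the ranking computations in this appendix follow one pattern: the criterion's importance divided by the total importance, multiplied by the Design Ranking Value. A minimal Python sketch (the function name is illustrative, not from the document):

```python
def ranking(importance, total_importance, design_ranking_value):
    """Ranking = (criterion importance / total importance) x DRV,
    the weighting rule used throughout the sensitivity analysis."""
    return importance / total_importance * design_ranking_value

# Table 4.1 scenario: Efficiency importance 5 of 12, Design 1 DRV = 0.258.
print(round(ranking(5, 12, 0.258), 4))  # 0.1075
```

Re-running a function like this for each importance assignment is an easy way to reproduce any of the twelve weighting scenarios.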
Appendix D. Datasheet

Design Option 1
Design Option 2
Design Option 3
