Software Engineering Project Final Report
Blue Team
The Pennsylvania State University, University Park, PA 16802
This report serves as the culminating experience for the software engineering project in
AERSP 440 Introduction to Software Engineering for Aerospace Engineers. Information
about requirements, design, coding, testing, and verification and validation is presented
and analyzed. Various reports and key data points are presented, such as UML sequence
diagrams and testing reports. The Blue Team found this project to be beneficial to their
learning in the course.
I. Introduction
This semester in AERSP 440 Introduction to Software Engineering for Aerospace Engineers, the class has
experienced two parallel activities. The first of these consisted of traditional instructional methods, namely
lectures, homework, and examinations. The latter consisted of a software engineering project;
contained here is the final report for that project. Specifically, we were directed by Dr. Long that:
The project for this course will be the development of a software/hardware system using a small three-
wheeled robot chassis and an onboard Arduino processor, sonar sensor, and wifi camera. Each team will
build a robot that will:
- Be controlled through a user interface on a laptop
- Be operated by a person who will not be in the same room as the competition
- Carry a webcam so that the person controlling it will see the image from the onboard camera
- Try to find and shoot the other robot using an infrared sensor
- Try to evade the other robot
Whichever team shoots the other team's robot the most wins.
In order to undertake this task, we needed a laptop and a C or C++ compiler. Additionally, Dr. Long provided:
A robot platform
An Arduino processor
An Arduino WiFi board
An Arduino Motor control board
A wifi camera (Foscam FI8909W)
Infrared transmitters and receivers
Batteries
The team noted early on that all of the above provided materials were to be returned at the end of the semester, or
that the team would have to reimburse Dr. Long for their cost. Our team elected Brad Sottile as its Chief
Executive Officer (CEO), Tom Gempp as its Chief Financial Officer (CFO), and Brian Harrell as its Chief
Information Officer (CIO).
Brad Sottile, CEO, is currently a graduate student in aerospace engineering. The other members of the team are
undergraduate students in aerospace engineering or engineering science. Various members of the team are student
members of AIAA and/or IEEE.
Department of Aerospace Engineering
II. Chiefs
A. Chief Executive Officer (CEO)
Brad Sottile is the Chief Executive Officer (CEO) for the Blue Team. The role of the CEO is to interact with the
other groups to maintain the project's costs and schedule. The CEO has also been the main point of
contact for the customer. Every week, Brad has tried to stay in touch with Dr. Long, the TA, and the other groups in
order to mentor the groups and help troubleshoot problems. The CEO has also acted as the Blue Team's
designated liaison to the CEO of the White Team, Tim Double. For the sake of space, a detailed Gantt chart
reflecting the Team's progress can be found attached to this report. All in all, some tasks were completed ahead of
schedule, some tasks ran late, and some tasks were completed right on time. The Blue Team found trying to work
ahead to be beneficial, since it gave us a little more time to fall back on when we did have schedule
slips. Overall, the CEO is pleased that this project was delivered on time and under budget, a rarity for many
software engineering projects.
B. Chief Financial Officer (CFO)
Tom Gempp is the Chief Financial Officer (CFO) for the Blue Team. The CFO is responsible for financial
planning and record-keeping, as well as financial reporting to higher management. Typically, the CFO reports
directly to the CEO and assists the Chief Operating Officer on all strategic and tactical matters as they relate to
budget management, cost-benefit analysis, forecasting needs, and the securing of new funding. Throughout the
course of this development, the CFO reported directly to the CEO and updated the team and client regularly.
Overall, the project was planned to cost $326,000.00; the final cost of the project was $206,150.00,
approximately 37% less than what was expected.
1. Constructive Cost Model (COCOMO)
The Constructive Cost Model (COCOMO), an empirical model based on project experience, is a well-
documented, publicly available model that is independent of any specific software vendor. There are three
distinct levels of project complexity that this model is able to represent: simple, moderate, and embedded. This
project fell into the simple complexity category, in which the following model can be used:

PM = 2.4 (KDSI)^1.05 × M    (1)

where PM is person-months, KDSI is the thousands of delivered source instructions, and M is the product of the
project and team characteristic multipliers, each rated on a scale from 1 to 6. The variable M is defined as:

M = PERS × RCPX × RUSE × PDIF × PREX × FCIL × SCED    (2)

where PERS is the personnel capability, RCPX is the product reliability and complexity, RUSE is the reuse
requirement, PDIF is the platform difficulty, PREX is the personnel experience, FCIL is the team support facilities,
and SCED is the required schedule. The following values were used for each:
PERS = 3, RCPX = 2, RUSE = 1, PDIF = 2, PREX = 2, FCIL = 1, SCED = 2
With 1,000 lines of code written (KDSI = 1) and using the COCOMO equation above, the expected effort for this
project was 115.2 person-months, which for a team of 35 people works out to 3.29 months per person. This appears
high considering that no one was able to work on this project more than part time; however, this overprojection
may be a function of our inexperience in software cost estimation.
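The estimate above can be reproduced with a short calculation. The sketch below simply encodes Eqs. (1) and (2) with the team's multiplier values; the function name is illustrative, not part of any standard library.

```cpp
#include <cmath>

// Sketch of Eqs. (1) and (2): PM = 2.4 * (KDSI)^1.05 * M, where M is the
// product of the seven characteristic multipliers rated by the team.
double cocomoPersonMonths(double kdsi) {
    // M = PERS * RCPX * RUSE * PDIF * PREX * FCIL * SCED = 3*2*1*2*2*1*2 = 48
    const double m = 3.0 * 2.0 * 1.0 * 2.0 * 2.0 * 1.0 * 2.0;
    return 2.4 * std::pow(kdsi, 1.05) * m;  // effort in person-months
}
```

With kdsi = 1.0 (1,000 delivered source instructions) this reproduces the 115.2 person-month figure quoted above.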
2. Initial Cost Estimation
The methodology used in conducting the initial cost estimation was the bottom-up approach. This method starts
at the component level and estimates the effort required for each component; these values are then added to reach a
final estimate. Table 1 shows the breakdown that each of the Chiefs and team leads submitted for estimation.
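As a sketch of the bottom-up method, the per-group figures can be summed directly. The numbers below are the hours and rates from the initial estimate (Table 1); the helper name is illustrative.

```cpp
#include <numeric>
#include <vector>

// Bottom-up estimate sketch: each component's cost (hours * rate) is
// estimated separately, then summed to a final project total.
double bottomUpTotal() {
    // Chiefs (CEO, CFO, CIO) at $400/hr; the five V-cycle groups at $200/hr
    std::vector<double> groupCosts = {
        180 * 400.0, 160 * 400.0, 160 * 400.0,  // CEO, CFO, CIO
        60 * 200.0,  90 * 200.0,  294 * 200.0,  // Requirements, Design, Coding
        102 * 200.0, 84 * 200.0};               // Testing, V&V
    return std::accumulate(groupCosts.begin(), groupCosts.end(), 0.0);
}
```

Summing the eight group estimates recovers the $326,000.00 planned budget cited above.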
Table 1. Initial Blue Team Cost Estimation
Midway through the project, a voluntary reassessment was conducted so that the Chiefs and team leads could have a
better understanding of the amount of remaining work needed to complete our mission objective. The following table
shows the breakdown of the Chiefs' and team leads' estimations for the second half of the project.
Table 2. Midsemester Blue Team Cost Re-Evaluation
3. Operational Target (OPTAR)
Using the operational target (OPTAR) system used predominately at the United States Naval War College, the
budget was able to monitored more efficiently. This required team leads to submit weekly hours reports for their
groups, which were then complied into a series of interconnected documents. Once the data was loaded into the
system, the OPTAR was updated instantaneously. The complete Blue Team OPTAR is shown in Figure 1.
Figure 1. Blue Team OPTAR
Group         Hours/Person  Team Members  Total Man-Hours (hrs)  Hourly Rate ($/hr)  Total Cost ($)  $/wk, 15 wks ($)
CEO           180           1             180                    400.00              72,000.00       4,800.00
CFO           160           1             160                    400.00              64,000.00       4,266.67
CIO           160           1             160                    400.00              64,000.00       4,266.67
Requirements  10            6             60                     200.00              12,000.00       800.00
Design        15            6             90                     200.00              18,000.00       1,200.00
Coding        42            7             294                    200.00              58,800.00       3,920.00
Testing       17            6             102                    200.00              20,400.00       1,360.00
V&V           12            7             84                     200.00              16,800.00       1,120.00
Total         596           35            1,130                  -                   326,000.00      21,733.33
Group         Hours/Person  Team Members  Total Man-Hours (hrs)  Hourly Rate ($/hr)  Total Cost ($)  $/wk, 7 wks ($)
CEO           55            1             55                     400.00              22,000.00       3,142.86
CFO           30            1             30                     400.00              12,000.00       1,714.29
CIO           30            1             30                     400.00              12,000.00       1,714.29
Requirements  5             6             27                     200.00              5,400.00        771.43
Design        5             6             29.5                   200.00              5,900.00        842.86
Coding        18            7             127.5                  200.00              25,500.00       3,642.86
Testing       10            6             60                     200.00              12,000.00       1,714.29
V&V           8             7             54.25                  200.00              10,850.00       1,550.00
Total         160           35            413.25                 -                   105,650.00      15,092.86
As can be seen in the preceding figure, the overall project was significantly under budget. During the first five
weeks of the project, all of the groups were working simultaneously in order to get ahead, with the majority of the
effort coming from the Requirements and Design groups. This surge leveled off from weeks five through eight,
which can be attributed to the majority of the work being done by the Coding group. There was a steady increase in
workflow from weeks nine through eleven, and a significant surge in fund expenditures from weeks eleven through
thirteen as Testing and V&V saw a significant increase in workflow; this trend continued up to the completion of the
project.
4. Blue Team Final Cost Breakdown
A complete analysis of the hours and fund expenditures by each of the groups was conducted. The breakdown of
each team is shown in Table 3.
Table 3. End of Semester Blue Team Cost Breakdown
As can be seen in Table 3, the overall project was significantly under budget. This can be attributed to using the
bottom-up method as described above. All of the groups overestimated their number of project hours because of their
unfamiliarity with the objective and tasks. The following table and figures show the distribution of the funds
expended throughout the course of the project.
Table 4. Cost Breakdown by Expense
Group         Total Work Hours  Hours Estimated  Hours Remaining  Funds Est. ($)  Funds Exp. ($)  Funds Remaining ($)
CEO           52                180.0            128.00           72,000.00       20,800.00       51,200.00
CFO           46                160.0            114.00           64,000.00       18,400.00       45,600.00
CIO           38                160.0            122.00           64,000.00       15,200.00       48,800.00
Requirements  58                60.0             2.00             12,000.00       11,600.00       400.00
Design        94                90.0             -4.00            18,000.00       18,800.00       (800.00)
Coding        324               294.0            -30.00           58,800.00       64,800.00       (6,000.00)
Testing       108               102.0            -6.00            20,400.00       21,600.00       (1,200.00)
V&V           174.75            84.0             -90.75           16,800.00       34,950.00       (18,150.00)
Total         894.75            1,130.0          235.25           326,000.00      206,150.00      119,850.00
Group         Total Work Hours  Hourly Rate ($/hr)  Funds Exp. ($)  % Hours  % Funds
CEO           52                400.00              20,800.00       5.81     10.09
CFO           46                400.00              18,400.00       5.14     8.93
CIO           38                400.00              15,200.00       4.25     7.37
Requirements  58                200.00              11,600.00       6.48     5.63
Design        94                200.00              18,800.00       10.51    9.12
Coding        324               200.00              64,800.00       36.21    31.43
Testing       108               200.00              21,600.00       12.07    10.48
V&V           174.75            200.00              34,950.00       19.53    16.95
Total         894.75            -                   206,150.00      100      100
Figure 2. Project Work Hours Distribution Figure 3. Project Fund Expenditure Breakdown
C. Chief Information Officer (CIO)
Brian Harrell is the Chief Information Officer (CIO) for the Blue Team. The CIO had several responsibilities over the
duration of this project. His primary responsibilities involved setting up a secure website for the Blue Team and
monitoring all files and file revisions. He began the semester by setting up a website for the Blue Team, which was
hosted by Google. The website contains a separate page for each step of the software engineering V-cycle
(Requirements, Design, Coding, Testing, Validation & Verification), as well as separate pages for the chiefs,
finances, Gantt chart, and hours reports. Each page of the website contains all of the files and file revisions for the
corresponding V-cycle group. In addition, the chiefs page contains brief descriptions of each chief's role, and the
finances page contains information and graphs related to our projected cost for this project as well as reports for each
week's progress and expenses. Finally, the Gantt chart page contains the original Gantt chart as well as updates to the
chart as they have been provided by the CEO.
On top of maintaining the website and monitoring access to it, the CIO also spent time backing up and
organizing all of our team's files on his own personal external hard drive. He also spent time discussing and
monitoring the progress of the project at several meetings with the chiefs and group leads. Overall, throughout the
course of this project, he was able to closely monitor access to and the security of our website, as well as ensure
that all groups had the necessary files and documents needed to complete the project in a timely manner.
III. Requirements
Ty Druce is the lead of the Requirements group. This portion of the report covers all the aspects learned about
requirements engineering, including the overall process, CONOPS, requirements elicitation and documentation, and
useful charts. This section focuses on lessons learned for each of these aspects. Requirements engineering is a
crucial part of the systems engineering process and vital to the success of any program.
A. Reflection and Lessons Learned
1. Requirements Documentation
The first step in requirements engineering was requirements elicitation. This is the process of requirements
gathering, and it required collaboration between the customer, program management, the design team, V&V, and the
user. It was learned that the coding team also played an integral part in determining requirements, since this team
has direct knowledge of the capabilities of the software itself. In future programs, the coding team should be
consulted sooner in the process to save time and money.
In requirements documentation, semantics played a crucial role. Many requirements were rewritten several times
to obtain the most clear and concise language. It is important that requirements clearly convey what the system
should do without overbearing constraints. Finding this balance was difficult, and it took a couple of weeks of practice
before good requirements could be written easily. Unclear language can result in misunderstandings and divert the
program from the proper path.
2. Requirements Schematics
Several types of schematics were used to visually represent the requirements and convey a high level picture of
the system. This team utilized system flow charts, UML diagrams, scenarios, and sequence diagrams to help visually
portray the system. Software engineering is uniquely difficult since it is not a tangible product; it is hard to visualize
and track progress. For this reason, it is imperative that visual aids are used to help team members understand the
scope of the system. These visuals also help program management track progress and help estimate hours and cost.
They are also helpful in making sure the customer and the designers are on the same page for the product.
3. Requirements Engineering
One of the leading causes of software engineering failures is poor requirements. Requirements
engineering is the first step in the engineering cycle, but also the most critical. Requirements are the cheapest items to fix
at the start of a program, but the most expensive to fix at the end of the program; therefore, it is critical to invest
time and skill into generating solid, clear, and practical requirements. This team learned that the requirements process is
iterative in itself. Developing proper language and conveying system and user capabilities takes several iterations.
B. CONOPS and Requirements Document
A Concept of Operations (CONOPS) and formal requirements document was approved by the customer and
then utilized by the Architecture and Design team to design the system. This document was also used by the
Verification and Validation team to ensure the proper product was built and all the requirements were met. This
document has been submitted with this report.
C. Traceability Matrix
A traceability matrix was used to describe the relationships between requirements, their sources, and the system
design. It helps to link dependent and related requirements, therefore making it easier to see how a requirements
change propagates to other requirements. Our traceability matrix may be found below in Figure 4.
Figure 4. Traceability Matrix
D. Viewpoint Charts
When developing a system, it is important to consider all actors who will be involved in using the system.
These actors may directly or indirectly act on the system, and they may be tangible or intangible actors. For
example, both the specific user and manager of a system play a separate and unique role in how they will interact
with it. Formal specifications, such as government guidelines, will also play a role in how the system is designed.
Figure 5 shows the viewpoint chart for this program.
Figure 5. Viewpoint Chart
E. Scenarios
Scenarios are used to describe real-life examples of how a system can be used. They make it easier for people to
understand how the system would react in certain situations. Below is an example scenario created for the program.
1. Initial Assumption
The robot seeks the White Team's robot and shoots it with an infrared laser more times than the White Team can
hit the Blue Team's robot. Each team's operator has the control laptop in a separate room and can only view the
robots through a wifi camera.
2. Normal Operation
The Blue Team's operator maneuvers the robot and quickly finds the White Team's robot. Once found, the robot will
begin firing an infrared laser at it and hit it as many times as possible. If the White Team locks onto the Blue Team's
robot, the blue robot will evade immediately and continue to seek and destroy the white robot. The Blue Team will hit
the White Team's robot with the laser more times than the White Team hits the blue robot.
3. What Can Go Wrong
The White Team locates the blue robot first and begins firing upon it before the Blue Team can find the white robot,
or the White Team's robot starts shooting the blue robot more times than the Blue Team is hitting the white robot.
In either case, the Blue Team must immediately perform evasive maneuvers and continue to seek and fire at the white
robot. Finally, if the blue robot hits an obstacle or gets stuck, the operator must immediately proceed around the
object or reverse out of the situation.
4. Other Activities
The White Team may be evading the Blue Team's lasers remarkably well, or the Blue Team keeps missing the white
robot.
5. System State on Conclusion
The Blue Team hits the White Team's robot with the laser more times than the White Team hits the blue robot, and
the Blue Team receives three extra credit points. The robot and the control laptop are then shut down.
F. Use Cases
Use cases were used to identify the actors in an interaction and to describe the interaction itself in the Unified
Modeling Language (UML). The following is an example of a simple use case describing the task of our current
system. The three actors in the diagram are the customer, who makes the demands; the user, who programs the system
and controls the robot; and the robot itself, which performs the action. The customer creates instructions for the user.
The user then edits the instructions for the robot in a language which the robot can understand. The robot then
transmits information back to the user so that the user's instructions may be accurate to the current situation of the
robot. The robot also takes the instructions from the user, interprets the commands, and performs the task. Our
use case may be found in Figure 6.
Figure 6. Use Case
G. Sequence Diagram
A sequence diagram was used to add detail to use cases by showing the sequence of event processing in the system.
A sequence diagram helps in understanding the interaction between the user and a system via the series of events that
take place for a given condition. Figure 7 is a general representation of a task described as a sequence of events.
Figure 7. Sequence Diagram
The sequence diagram below shows the interactions between the user, the motherboard, and the robot. It is
the responsibility of the team to ensure that the robot gives feedback to the user of whether or not a task can be
completed, when it was completed, and what the user can do to correct a possible error. Here it shows whether the
commands were accepted, whether the laser was fired, and whether the robot has taken damage from the other robot
in competition.
H. Structure Chart
The robotic system that the Blue Team will be using is made up of two main parts. The vehicle system will be in
the room of the competition while the controller or operator will be in another room. As the competition
commences, the camera mounted on the vehicle will continuously send a real time feed through an internet browser
on the operator's laptop in the other room. From this separate room, the operator will control the vehicle using a
series of commands that will be sent to the processor and board on the vehicle system. These commands will funnel
down from the processor to the IR gun and sensor as well as the vehicle motor. Figure 8 shows the robotic system
breakdown.
Figure 8. High Level Structure Chart
I. Requirements Conclusions
Requirements engineering begins the development process and is one of the most critical aspects of a program.
The difference between good and bad requirements can be the difference between a successful and an unsuccessful
program, or a program with runaway costs at the end to fix poor requirements. Requirements elicitation should
utilize the several charts explained in this document to help engineers consider all actors involved in a system. It is
important that requirements engineers understand the capabilities and limitations of the systems they are creating so
that infeasible requirements do not plague lower-level design engineers. Understanding requirements is necessary for
all engineers.
IV. Design
Evan Masters is the lead of the Design group. The design team first began with the information provided from
the requirements engineering process. Requirements were reviewed, rated, and traced based on understanding and
clarity for the system. Preliminary and architectural designs were created in the form of sequence diagrams,
structure diagrams, state diagrams and data flow models. The key goal of the design team was to create a design
portfolio that could be easily understood and developed by the coding team, and then passed to testing and
validation and verification (V&V). It was of paramount importance that the design team work quickly and
efficiently to produce a system design so that the coding team could get a quick start on code development. In
addition, the preliminary budget dictated that the coding team would expend the most hours on the project, so the
design team needed to work in an efficient manner to produce preliminary and architectural designs to the team
within the budget guidelines. Once each team progressed through the V-cycle, the design team used an iterative
approach to the architectural design to meet the overall team goals. As requirements were updated, the
corresponding design was tweaked and edited to meet these new requirements. During each iteration this
information was passed on to coding, testing and V&V to ensure that the overall team was making progress and
conforming to each updated design. It is the belief of the design team that a robust and efficient design was created
to give the team the greatest chance of completing the objectives set forth in the competition, thus placing the team
in position to win the overall competition.
A. Reflection and Lessons Learned
This passage reviews and reflects on the design engineering portion of the software engineering project. It
focuses on the lessons learned during this process.
1. Design Process
To prepare for the design process, the team familiarized itself with chapter 3 of the Software Engineering Book
of Knowledge (SWEBOK) and section 2.3 of a Gentle Introduction to Software Engineering (GISE). These
publications discussed the need for a robust design and ways in which this can be achieved. The next step in
developing an architectural design for the robot was to obtain the requirements document from the requirements
team. A preliminary design was critical in understanding the link between product specifications from requirements
and detailed interactions between the individual components. This architectural design is important in the
verification and validation of non-functional requirements of the system. For this project, the simplified V-cycle
does not allow for V&V to take place until some testing has begun. In a real software application, this V&V work
would take place at each step in the software development cycle, which contributes to V&V often consuming 50%
or more of the overall budget.
Detailed design of the system and its components revolved around creating diagrams that show the interaction
between system components for each of the functional requirements. These models were created in multiple ways
through sequence, state and data flow models to give the coding team different angles of the same processes to
create a more clear vision of the system they are to develop. A very important portion of software design lies with
giving the coding team enough information from which they can develop a system that not only completes the
intended goal, but also is created in a way that the testing and V&V teams can test and verify the system
requirements in a reasonable manner.
2. Design Models
The models created for the coding teams software development included sequence diagrams for both the
movement and infrared systems. A structural model of all components and their main interactions was created to
show the overall system, and a state transition model included all of these components for their various run case
situations. These systems were also modeled for redundancy in data flow models. The goal of creating many
diagrams for the coding team was to ensure that they were able to fully understand the system they were to create,
while showing the component interactions from different viewpoints. Because of the extensive costs related to
coding, a good design is critical in reducing costs associated with meetings and interactions between design and
coding team members to clarify models, when this time should be spent by the coding team writing code.
3. Design Engineering
A sound architectural and detailed design of a software system is critical in maintaining a schedule and meeting
budget requirements for an entire project. By designing a system that can be readily developed into useful code, the
V-cycle for software development can seamlessly transition from design to coding, testing and verification and
validation with fewer changes to be made. In the case of requirements changing, design models can be updated and
passed on to the coding team to make appropriate changes, while limiting the time required to implement these
modifications. As with other areas of the software lifecycle, the design method became an iterative process between
design, requirements and coding.
B. Requirements Ratings Table
The design team individually rated the requirements numerically on a scale from 5 (the requirement is not well
understood and a good design cannot be developed) to 1 (the requirement is well understood and a good design can
be developed from it). The team then discussed as a group how well they understood the requirements, and comments
were relayed to the requirements team for revisions. This was an iterative process, after which all of the requirements
were understood by team members. The final ratings are shown in Figure 9, with averages taken; a blank rating
indicates that the individual believed they could not create an acceptable design from the requirement.
Figure 9. Requirements Ratings Table
C. Traceability Matrix
Once the design team had determined that they fully understood these requirements, a traceability matrix was
constructed and is presented as Figure 10. It helps to link dependent and related requirements in a way such that it is
easier to see how a requirement can affect or change other requirements. Pertaining to design, this helped construct
the data flow and sequence models to guide the commands throughout the system from start to finish. Understanding
the related and dependent requirements also helped to construct the overall system structure diagram.
Figure 10. Traceability Matrix
D. Sequence Diagrams
Sequence diagrams were created for two distinct scenarios. One situation modeled the movement commands
from the user to the robot, while the other showed how the infrared system should interact with the enemy robot.
The following diagrams illustrate a use case for movement and the infrared gun, with other system constraints set in
to show pertinent non-functional requirements. Figure 11 is the sequence diagram for the movement control system, and
Figure 12 is the sequence diagram for the IR system.
Figure 11. Sequence Diagram for Movement Control System
Figure 12. Sequence Diagram for IR System
E. State Transition Model
A state transition model shows the various states of components onboard the robot at any instant in time while
the robot is active. The diagram below shows the movement flow for the required forward, backward, and
left/right rotations, as well as the infrared gun trigger and delay mechanism. The infrared sensor is continuously on,
sensing for hits from the enemy IR gun. This model helps the coding team understand the various functions of
the robot and how they are interconnected. It also shows some of the non-functional requirements, such as the two-
second delay on firing.
Figure 13. State Transition Model
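The drive states and the two-second firing delay in the model can be sketched in code as follows; the state names and the millisecond clock parameter are illustrative assumptions, not taken from the team's implementation.

```cpp
#include <cstdint>

// Sketch of the state transition model: the drive states from the diagram,
// plus the two-second reload delay on the IR gun trigger.
enum class DriveState { Idle, Forward, Backward, RotateLeft, RotateRight };

struct RobotState {
    DriveState drive = DriveState::Idle;
    std::uint32_t lastFireMs = 0;  // timestamp of the last accepted shot
    bool hasFired = false;

    // The gun may fire only once the two-second delay has elapsed.
    bool canFire(std::uint32_t nowMs) const {
        return !hasFired || nowMs - lastFireMs >= 2000;
    }
    void fire(std::uint32_t nowMs) {
        if (!canFire(nowMs)) return;  // still in the delay window: ignore
        hasFired = true;
        lastFireMs = nowMs;
    }
};
```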
F. Structural Model
A structural model displays all of the system components with arrows directing the ways in which information is
passed between them. This integrates both hardware and software to show the full system. This is useful for the
coding team to check where information is being passed to, and making sure that these are the correct destinations.
Figure 14. Structural Model
G. Data Flow Diagrams
A data flow diagram represents the flow of information through a system, with particular emphasis
on its process aspects. It shows what kind of information will be input to and output from a
system and its various subcomponents. Figure 15 illustrates the data flow model for the IR system, while Figure 16
illustrates the data flow model for movement.
Figure 15. Data Flow Model for IR System
Figure 16. Data Flow Model for Movement
H. Design Conclusions
To fully implement and confirm that the requirements have been met, a well understood design is necessary.
Using different diagrams is needed to explain and document the different component interactions in the robotic
system so that the coding team can develop a robust code that will execute the mission successfully. This was
accomplished through taking time to understand the functional and non-functional requirements set forth from the
customer needs, after which these requirements were examined for their relations and dependencies. From this,
preliminary structure and sequence diagrams were created, which were then revised after communicating with the
coding team to determine their capabilities of developing code. These final diagrams were then split into component
and system models to show how the movement and infrared systems should interact with the robotic vehicle in the
test arena. These detailed sequence diagrams provide the basis for movement and infrared shooting, with listed non-
functional requirements that constrain the overall system. By using both data flow and sequence models, the coding
team is given multiple angles of the final design to aid in their code development. There will be continued
communication between all of the teams included in this project, ultimately using an iterative V-cycle process to
develop a robust design capable of winning the final competition.
V. Coding
Kelvin Nguyen is the lead of the Coding group. Software construction is concerned with the creation of working,
useful software. Desirable traits of this software include minimal complexity, anticipation of change, constructs for
testing and verification, and use of standards.
Figure 17. Client Code Flow Diagram
A. Client Side Code Summary
A high level flowchart for the client side code is shown in Figure 17. The main function begins by creating an
instance of the Game class, which is where the connection is initialized. Should the initializations fail, the
program closes. Otherwise, the game object begins running and transitions into its main loop.
While the game is running, the code loops through many events. First, the event log and keyboard state are
updated. If the controller is attached, the code will take input from the controller and check for any desired events.
However, if the controller is not attached, the controller pointer is reset and input is taken from the keyboard. Once
the code checks the state of the keyboard or controller, it then fills a 5 element array with the appropriate values to
control the wheels and IR gun of the robot.
Finally it sends the array via the established Wi-Fi connection to the robot. If the IR gun has been fired during
the past two seconds, a reload flag will be set in place to prevent subsequent firings too early, and an appropriately
sized rectangle will be rendered on the GUI to act as a reload bar. Should the client code detect a disconnect from
the Arduino server, a Disconnected image will be rendered in place of the Connected image on the GUI.
B. Detailed Source Code Components
Shown in Figure 18 is a detailed UML sequence diagram of the client side code. The major components of the
code are then detailed.
Figure 18. UML Sequence Diagram
1. Key Variables
The Uint8 sendarray[] variable is a five-element array that is sent to the Arduino Wi-Fi board. The variable tells
the Arduino to do two different things: move and fire. The first and second elements control the left wheel, while
the third and fourth control the right wheel. The first and third elements control the speed of the wheels, with a
higher value corresponding to a faster speed; these values range from 0 to 255. The second and fourth elements
control the direction of the wheel: if the integer is 1 the wheel will turn forward, and if it is 0 it will turn in the
opposite direction. The last element determines whether the IR gun fires: if the value is 1, voltage is sent through the IR
gun. Each element is only one byte in size, which allows for simple transmission from the client to the Arduino.
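As a concrete illustration of this layout (a hypothetical sketch, not the team's actual source), packing a command into the five-byte array might look like:

```cpp
#include <array>
#include <cstdint>

// Hypothetical sketch of the five-byte command layout described above:
// [0] left wheel speed (0-255)   [1] left wheel direction (1 = forward)
// [2] right wheel speed (0-255)  [3] right wheel direction (1 = forward)
// [4] IR gun trigger (1 = fire)
std::array<std::uint8_t, 5> packCommand(std::uint8_t leftSpeed, bool leftForward,
                                        std::uint8_t rightSpeed, bool rightForward,
                                        bool fire)
{
    return {leftSpeed,  static_cast<std::uint8_t>(leftForward ? 1 : 0),
            rightSpeed, static_cast<std::uint8_t>(rightForward ? 1 : 0),
            static_cast<std::uint8_t>(fire ? 1 : 0)};
}
```

Because each field is a single byte, the whole command fits in one small packet, which keeps transmission to the Arduino simple.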
The integer time_now is updated at the start of each loop iteration; it is used to keep track of the clock and ensure
that events do not happen too often. Further, the integer time_fire is updated with a call to SDL_GetTicks() when a
fire command is sent; this variable is used to prevent firing more often than once every 2.0 seconds. Finally, the
integer time_hit is updated with a call to SDL_GetTicks() when the robot detects a hit event; this variable is used
to prevent a single hit from being registered more than once every 1.5 seconds.
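The timestamp guards described above amount to simple millisecond comparisons; a minimal sketch follows (the function names are illustrative, not the team's):

```cpp
#include <cstdint>

// Illustrative timing guards using SDL_GetTicks()-style millisecond timestamps.
// Names and structure are assumptions sketched from the description above.
bool canFire(std::uint32_t time_now, std::uint32_t time_fire)
{
    return (time_now - time_fire) >= 2000; // 2.0 s reload between shots
}

bool registerHit(std::uint32_t time_now, std::uint32_t time_hit)
{
    return (time_now - time_hit) >= 1500;  // ignore repeat hits within 1.5 s
}
```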
2. Functions
The first function is OnInit(). This section of the code handles initialization of the SDL and Winsock
components. In this function a connection is made between the user's computer and the robot. If the connection
succeeds, the program continues; if a connection cannot be established, the function keeps attempting to connect
until the user terminates the program. Before a connection is attempted, however, the function initializes SDL and
creates a window in the upper left corner of the computer screen, which the client will later use to communicate
visually with the user.
After passing through the OnInit() function, the client runs through one of two functions: GetControllerInput()
or GetKeyboardInput(). In this section of the code the system takes the user inputs, through whatever medium the
user is using, and turns them into values. During each cycle of the main loop the client checks for input through
either the GetControllerInput() or GetKeyboardInput() function. If the client detects a controller, it uses the
GetControllerInput() function, which reads in the state of all the buttons and joysticks and turns them into
usable values that are stored in the Uint8 sendarray[] variable. When filling the first slot of the array, the
function adds a bias value to compensate for a hardware problem in the robot's left wheel. Should a controller not be
connected, the GetKeyboardInput() function is called to fill the Uint8 sendarray[] variable. Like the
controller input function, this function maps the state of the keyboard into usable values and adds an additional
bias to the first value in the array.
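The left-wheel bias can be pictured as a small additive correction clamped to one byte; the sketch below is an assumption about the mechanism, and the bias value itself was a tuned constant not given in this report:

```cpp
#include <cstdint>

// Hypothetical sketch: add a bias to the commanded left-wheel speed to
// compensate for the hardware mismatch, clamping to the one-byte range.
// The bias value used by the team is not stated in the report.
std::uint8_t applyLeftBias(std::uint8_t commanded, std::uint8_t bias)
{
    unsigned biased = static_cast<unsigned>(commanded) + bias;
    return static_cast<std::uint8_t>(biased > 255 ? 255 : biased);
}
```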
After the GetControllerInput() or GetKeyboardInput() function is called within the main loop, the client goes
through the functions OnLoop() and OnRender(). This section of the client code contains the transmission of data
to the robot and the rendering of the visual cue the user needs to determine whether or not they can fire the IR
emitter at a given time. The OnLoop() function packages the array generated by the input functions and determines
whether the IR gun is allowed to be fired; if so, it sends the array to the robot as is. At the start of the OnLoop()
function, the time_now variable is set with the SDL_GetTicks() function, which returns the number of milliseconds
since the program started. When a packet is sent, or the IR gun is fired, further calls to SDL_GetTicks() are
used to set the values of time_fire and time_sent. Conditional if statements then limit packets to one every 40 ms,
so as not to overload the server. Another if statement controls firing of the IR gun: if the gun should not be fired,
the value pertaining to the IR gun is changed so it will not fire (the fifth element of the sendarray[] variable is set
to 0). After this, the loop runs through the OnRender() function, which handles rendering of the reload bar and GUI
images to the window. This function uses time_now, time_fire, and time_hit to generate SDL visuals that
correspond to the robot's IR gun and sensor; in particular, it uses the time_now and time_fire variables to generate
a red box that tells the user whether or not they are allowed to fire the IR gun.
3. Third Party Libraries
The code utilizes two third party libraries for many of its primary functions. The first, SDL, or Simple
DirectMedia Layer, allows for fluid, low-level input from various devices, as well as low-level graphical output.
SDL is a cross-platform library written in C that is used in video playback software, emulators, and even in popular
games from various developers, including Valve. The library has many capabilities, but it is used in this code mostly
for keyboard and controller input and for creating simple graphical outputs. Because SDL is designed for use
in video games, it was an attractive choice in terms of the desired capabilities of our software; one of the key goals
was to make the control scheme as simple and sensible as possible, and casting it as a game seemed like the best
way to do that.
SDL's primary capability revolves around the use of events. Any time an event is triggered, it is added to the
event log, and the functions can be used to poll for any desired events, while ignoring any others, and resolve
anything that the code should do upon those specific events. Things like key presses, controller button presses,
mouse clicks, and many others are all distinct events that the library can recognize. Another useful capability of
SDL is its ability to retrieve a keyboard or controller state, as opposed to polling for distinct key press or joystick
motion events. That is, the library has functions that return whether or not a key is pressed, or how far a joystick is
moved on either axis, so that input can be obtained based on the state of the device, to allow for a constant stream of
the desired user command. The last capability of SDL our code utilizes is the graphical output. SDL's simplest
graphical capabilities involve rendering simple shapes and image files to a generated window. The code utilizes
these capabilities to display messages, saved as bitmap files, as well as the reloading bar, a simple animation that is
just a series of rendered rectangles increasing in size.
The second third party library, WinSock, is a Windows platform-dependent networking library. There are two
standards for networking in C/C++: for Windows platforms, it is WinSock, while Unix platforms have their own
standard networking libraries. The decision to use WinSock was made for a couple of reasons. First, almost the
entire coding group runs Windows platforms, and getting the UNIX libraries installed and linked properly on a
Windows platform would have been an undesirable amount of trouble. Second, the tutorials we found for WinSock
were easier to follow, especially considering that none of us had experience in networking. The WinSock
library is used for all the networking functionality of the client code. The connect(), send(), and recv() functions
form the basic networking capabilities required for the software. Winsock is used to create and bind a socket,
change the settings so that communication works the way it needs to, and send and receive packets on that socket.
C. Arduino Side Code
The Arduino-side code is reviewed next.
1. Key Global Variables
The char ssid[] variable holds the Wi-Fi network name, and the char pass[] variable contains the Wi-Fi
password. The byte in[] variable is a 5-byte array used to receive and store incoming packets; the first four bytes
control the direction and speed of each motor, while the fifth byte controls the state of the IR gun. Finally, the
variables time_now, time_fired, and time_sent are used to keep track of the current time and the times at which
sending and firing last occurred; all of these variables were initialized to 0.
2. setup()
In the setup() function, the interrupts for the IR receiver are declared. This section of code was provided by
Vidullan, our graduate teaching assistant, to ensure fair and symmetric hit detection between both robots. The
middle part of the setup() function declares the motor pin and IR pin as outputs to control the motor and IR gun.
The remainder of the function contains a while statement, which attempts to connect to the declared Wi-Fi network
using the defined password. Once a connection is established, the setup() function is complete.
3. loop()
At the start of each loop, the millis() function is called to set the value of time_now to the time, in
milliseconds, since the program started. As in the client code, this allows the Arduino to control when the board
checks for new data and when to allow the IR gun to fire. Whenever a new packet is received, or the IR gun is fired,
calls to millis() are used to store the time the event occurred. Every 50 milliseconds the Arduino Wi-Fi shield
checks for a packet from the client code. Once a packet is detected, calls to client.read() are used to read and store
control information in the in[] array. Once the array is filled, the MotorControl() function is called to command the
wheels to move at the specified speed and direction. On each loop iteration, a hit flag is used to determine whether
or not the robot is being hit by the opposing IR gun; the criteria for this flag returning true are built into the code
provided by Vidullan. A conditional if statement is used to trigger an event, such as lighting an LED or freezing the
robot, to acknowledge the hit.
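MotorControl()'s implementation is not listed in this report, but decoding one wheel from the in[] array might be sketched as follows (the signed-speed convention here is an assumption for illustration):

```cpp
#include <cstdint>

// Hypothetical decode of one wheel from the five-byte packet: a speed byte
// plus a direction byte (1 = forward) mapped to a signed speed value.
int wheelCommand(std::uint8_t speedByte, std::uint8_t dirByte)
{
    int speed = static_cast<int>(speedByte);
    return dirByte == 1 ? speed : -speed; // negative = reverse
}
```

On the real robot the sign would select the H-bridge direction pin while the magnitude drives the PWM duty cycle, but that wiring-level mapping is not detailed in the report.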
Figure 19. Arduino Wiring Diagram
D. Wiring Diagram
Figure 19 details the various wiring connections on the robot. The Arduino UNO and Wi-Fi Shield are connected
directly on every pin. A few pins had to be rewired between the Wi-Fi Shield and Motor Shield, via the blue and
green wires, because both shields are designed to use them by default. The Arduino code is written for all these
specific pin connections, so any changes made in the hardware must be mirrored in the software for the system to
work properly.
E. Coding Conclusions
In the end, the client code proved effective at commanding the Arduino robot through a TCP Wi-Fi
connection. At a high level, operation of the robot is very simple, as there are only five degrees of freedom
associated with the robot: the direction of each wheel, the speed of each wheel, and the state of the IR gun.
Commands (the 5-valued array) are simply sent to the robot at continuous, set intervals, and based on the values in
the array, the controllable components of the robot (the two wheels and the IR gun) are commanded. The infrared
sensor drives an interrupt, which flips a flag when triggered to perform the penalty action: turning on the LED and
freezing the robot momentarily.
VI. Testing
Steve Stanek is the lead of the Testing group. The primary goal of testing should be not only to identify and
correct errors but also to identify the root causes of errors and modify the software development process so that
current error trends do not continue. It should be noted that testing can only show the presence of errors; it cannot
show their absence. It is possible for the developed software to have minimal errors, yet still not fulfill the
users' needs. Organizations should also keep in mind that cutting the amount of time allowed for testing not only
increases the number of remaining errors but also eliminates the chance to fix those errors.
A. Reflections and Lessons Learned
There are eight distinct sections of testing: component, static, dynamic, unit, integration,
system, defect, and release testing. Though testing cannot show the absence of errors, through these different testing
procedures the end product should perform as well as possible.
1. Component Testing
Component tests are designed to ensure that the individual components of the system function properly in
isolation from the rest of the system. During the process, the system is broken down into smaller components,
which should be the smallest testable parts of an application. Components may be individual functions,
object classes, or composite components. The individual units are then tested to expose defects. An example of
component testing in this project was testing the motors, which revealed unequal spin rates of the two
wheels. To fix this, a gain was added to ensure the wheels spun at equal rates.
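The gain fix described above can be sketched as a per-wheel multiplier; the gain value below is illustrative, not the team's measured constant:

```cpp
// Illustrative per-wheel gain to equalize spin rates. The commanded speed is
// scaled, rounded to the nearest integer, and clamped to the one-byte range.
// The actual gain was determined experimentally and is not given in the report.
int correctedSpeed(int commanded, double gain)
{
    int out = static_cast<int>(commanded * gain + 0.5); // round to nearest
    return out > 255 ? 255 : out;                       // clamp to one byte
}
```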
2. Static Testing
Static testing examines the software without executing it. It is mainly performed by visually scanning
the code or by using software tools that help debug it. During static testing, hardware and integration aspects are
ignored and the code itself is examined for errors. During the addition of several pieces of code, multiple reviewers
checked that the code being added was correct. This included any time constraint, such as the two-second
delay between firings of the IR gun.
3. Dynamic Testing
Dynamic testing is a vital process in successfully delivering a product to the consumer. It is executed while a
program is in operation, typically examining system memory, functional behavior, response time, and overall
performance of the system. The main goal of these tests is to execute the program with the sole intent of finding
where errors occur. This is carried out by running a certain case with a known outcome; the known outcome is then
compared to the outcome from the system, and if the two differ, there is an error in the system. When planning
dynamic testing, it is important to design tests in which errors are systematically discovered.
An example of a dynamic test is the testing of the IR gun and the reload bar found in the command
window. The requirements state that the IR gun cannot fire more than once every two seconds. Because of this, the
IR gun and the reload bar should recharge at the same time. This was tested in two ways. First, each component was
tested individually: both were examined visually, and the reload time, or the flash of the IR gun as seen through a
digital camera, was measured with a stopwatch. Then they were tested together dynamically: the flash and the
recharging reload bar were put side by side and examined. Multiple shots were fired consecutively, showing that
both were indeed on the same two-second recharge.
4. Unit Testing
The purpose of unit testing is to divide the software into separately testable units and verify their functionality
in isolation from one another. If possible, unit testing should not be done by the developers, as they may not feel
inclined to honestly identify errors in the software. There are two different types of unit testing: white-box
testing and black-box testing. During white-box testing, the unit testers have access to the unit's code; this helps the
testers plan their test cases and also allows them to verify that the unit is doing what it was designed to do. Black-
box testing involves testing how the unit responds to typical input. Due to the large number of possible inputs,
complete black-box testing is impossible. It is also important to note that black-box testing requires the testing of
invalid inputs. In this case, with the program running, multiple keystrokes that should have no binding were used
to test whether they affected the system in any way.
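This invalid-input check can be modeled as a binding predicate: unbound keys must leave the command untouched. The specific key characters below are illustrative assumptions, not the report's documented bindings:

```cpp
// Illustrative binding predicate for black-box input testing: only bound
// keys may change the robot command; every other key must be a no-op.
// The actual key bindings are assumptions for this sketch.
bool isBoundKey(char key)
{
    switch (key)
    {
        case 'w': case 'a': case 's': case 'd': // movement (assumed)
        case ' ':                               // fire
            return true;
        default:
            return false;                       // unbound keys do nothing
    }
}
```

A black-box pass then sweeps the keyboard and verifies that every key for which this predicate is false produces no change in the outgoing command array.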
5. Integration Testing
The purpose of integration testing is to verify that the individual units of the software work in
conjunction with each other in the way they were designed. Integration testing focuses on the interfaces
between the various units of the software, since these tend to be a problem area in development. Things such
as parameters and global variables, as well as all possible interactions between units, must be examined
and tested. When testing large and complex software, it is usually a bad idea to test the system by putting
all the components together at once; in these cases it makes more sense to integrate the components
incrementally. In this project, the motion was tested first, followed by the Wi-Fi camera feed, working
incrementally up to the IR gun and sensor.
6. System Testing
The purpose of system testing is to verify that the system is working as a whole. It is also useful for determining
if the system meets its non-functional requirements. At this point in testing, the majority of errors should have
already been identified and dealt with in previous stages. For our case, once assembled and fully integrated, the
robot was tested as a whole to ensure the individual tests held up.
7. Defect Testing
The definition of defect testing inherently means that it is never truly finished: the absence of discovered errors
does not mean that errors do not exist, only that they have not yet been found. While testing the robot, expected
error messages were displayed on the GUI with the specified actions, but no unforeseen errors occurred. The testing
process for this class of defects will continue until the product is released, and it is the responsibility of all team
members to track it.
Proper communication between the different groups is vital in order to ensure that proper defect testing is
administered.
8. Release Testing
Before a software product is released for public use, release testing is performed to ensure that it
is error free, safe for use, and has met the customer's needs. Before the product is released to the customer, the
robot's functionality will be tested and the prior tests evaluated to ensure they have all been completed properly.
B. Testing Highlights
This section highlights each component's description, purpose, characteristics, testing procedure, and testing
results.
1. The vehicle must be able to execute all commands given from the controller
Purpose of the Part
- The controller is what the user will use to control the robot, enabling proper command input to achieve
victory in the final competition.
- Single inputs should be able to make the robot rotate left/right, move forward/backward, and fire an IR
gun.
Characteristics of the Part
- On the laptop keyboard there are designated keys for left, right, forward, and backward motion
of the robot. The Xbox 360 controller uses the left joystick for forward and backward motion, with the right
joystick used for rotation.
- The spacebar and right trigger are utilized to fire a single IR gun shot.
- The up and down arrow keys, as well as the right and left bumpers, are used to increase or decrease the
speed of the robot, respectively.
Procedure
1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: Left/Right rotations, forward/backward movement, and fire
the IR gun.
3. Make note of any discrepancies or malfunctions between the robot's actions and the command window.
Results
1. Initially the forward motion of the robot was impeded due to each wheel rotating at different rates.
a. Adjustments were made and the wheels were given different gains and now rotate at the same rate.
i. The robot is now able to move forwards or backward in a straight line.
2. On occasion the robot would become stuck in a rotating state
a. Adjustments were made in the code before secondary testing of the rotation.
i. The robot now only gets stuck in a rotation if it becomes disconnected from the controller.
3. The IR gun fires once for each press of the spacebar.
2. Command Window
Purpose of the Part
- The command window will be used by the controller to send signals to the robot that will enable it to
move, fire the IR gun, and transmit video.
- The user command window should utilize single keys for movement and fire commands.
- The command window will output necessary messages.
Characteristics of the part
- The command window should appear in an aesthetically pleasing manner to the user on the operating
laptop.
- The command window should show the reload timing of the IR gun so that the user can see when
they are able to fire.
Procedure
1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: Left/Right rotations, forward/backward movement, and
fire the IR gun.
3. Make note of any discrepancies or malfunctions between the robot's actions and the command window.
Results
1. There were no discrepancies between the input commands from the window and the robot's actions.
2. When the robot is online the window reads "Connected." If the robot disconnects, the "Connected"
message is replaced by "Disconnected." Furthermore, the command window displays a hit message
when the robot is hit.
3. Wi-Fi Camera and the video feed that is provided to the user; video speed/lag and a basic level of clarity are the
main focus of the testing
Purpose of the Part
- The primary purpose of the part is to provide a reliable source of visual data to the user so that proper
commands can be given in order to complete a specified mission objective
Characteristics of the Part
- The camera provides a basic video feed via the Wi-Fi board to the user oriented computer
- The frame rate and video quality can be varied, and inherently affect the performance of the robot
- Required power vs. video quality is a major focus due to the limit of available amperage and voltage
Procedure
1. The robot will be powered up and set to normal operating parameters to initialize the testing process
2. Video feed will be started with the robot in a stationary form.
3. Physical objects will be introduced into the field of view of the camera and quickly removed. Upon
removing them, the elapsed time will be measured to determine the lag of the video feed.
4. In addition to the feed timing, any issues such as black-outs or feed drop will be noted.
5. The entire process will be repeated with the robot in motion to ensure that it can continue proper
functions while in a dynamic state.
6. A frame rate count is determined using the computing functions of the robot, not from a direct test.
Results
1. The camera functioned properly and provided a basic visual feed that the user could use to direct the
robot.
2. One issue that arose was considerable video lag; an overall lag of 5.2 seconds occurred. It was
determined that this was attributable to the high frame rate provided to the user. It was corrected by
lowering the frame rate to 10 FPS.
3. Required power for the video unit was not a problem. The battery unit supplied enough power that the
robot could continue dynamic functions and still provide the required video feed.
4. The vehicle should have a manual on/off switch for the batteries and the camera
Purpose of the Part
- The manual on/off switch will enable the robot to be disconnected and powered down separately from
the controller.
- Having a backup option to power down the camera and robot will allow for a failsafe in case there are
any issues with the controller during runtime.
Characteristics of the Part
- There is a manual toggle switch located on the robot that allows for power to be turned on and off.
Procedure
1. Start up the robot following the normal start up procedures.
2. Test the on/off switch by changing from the on position to the off position.
3. Make note of any discrepancies or malfunctions between the robot and the command window.
Results
1. The on/off switch properly controls the robot's power as well as the camera's power.
5. The vehicle must be able to operate on linoleum floors
Purpose of the Part
- The tires and power supplied to them must be enough so that the vehicle can navigate in a controlled
and efficient manner on a linoleum floor.
Characteristics of the Part
- The tires are made up of a standard rubber material with standard all-purpose treading.
Procedure
1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: Left/Right rotations, forward/backward movement all on a
linoleum floor.
3. Make note of any discrepancies or malfunctions between the robot's actions and the command window.
Results
1. The robot was able to execute all commands efficiently and in a controlled manner on the linoleum
floor.
6. IR gun
Purpose of the Part
- The Infrared gun will be utilized to hit the opposing target that is equipped with IR sensors. The IR gun
is not allowed to fire more than once every 2 seconds.
Characteristics of the Part
- An IR beam cannot be focused to pinpoint accuracy. Due to this, the part will be tested to show the
range and cone of the IR shot. By measuring this, the user will be given proper knowledge to
successfully hit their target.
- The beams can often become misaligned. Due to this, the IR beam will be retested at regular
intervals to monitor alignment.
- Like hockey, the wall has potential to be your friend. In hockey, players use the wall as a way to deflect
pucks around other players or to create a passing lane. An IR beam can act the same way. It will be
tested to see the reliability of bouncing the IR beam off a wall to the opponent.
Procedure
1. To ensure the functionality of the IR gun, our own receiver will be used to detect whether hits occur.
2. To give us the upper hand, the characteristics of the part will be tested and monitored without involving
the other team. In the room that will host the competition, the IR gun will be fired from various locations
to bounce shots off the walls. This test will either confirm or debunk the utility of the walls.
3. Twice per week the IR gun will be tested to ensure the alignment is correct. A more in-depth description
of this procedure will become available at a later date.
Results
1. Prior to the update of the firmware, it was determined that the IR gun is functional.
a. After the firmware update, the IR gun remains functional.
2. It was determined that the utilization of walls is not favorable.
7. IR receiver; the IR receiver is composed of three sensors, located at the front, left, and right side of the robot
C++ Files:
Game.cpp
GameCleanup.cpp
GameGetControllerInput.cpp
GameGetKEyboardInput.cpp
GameLogSDLError.cpp
GameOnInit.cpp
GameOnLoop.cpp
GameOnRender.cpp
Main.cpp
WifiConnection.cpp
WifiConnectionConnect.cpp
WifiConnectionInitialize.cpp
WifiConnectionLogWSAError.cpp
WifiConnectionQuit.cpp
WifiConnectionSendData.cpp
/** WifiConnection Header File
    Contains the information about the WifiConnection class
*/
#ifndef WIFICONNECTION_H_INCLUDED
#define WIFICONNECTION_H_INCLUDED

#include "header.h"

class WifiConnection
{
private:
    friend class game;          // allows the game class to access all private data of the WifiConnection class
    u_short PORT;               // port number
    const char* IP_ADDRESS;     // character array that holds IP address
    WSADATA wsa;                // needed for WinSock initialization
    SOCKET s;                   // socket
    struct sockaddr_in server;  // server struct that holds IP and port information
    Uint8 sendarray[5];         // control data array
    Uint8 arraysent[5];         // stores data just sent
    char* ptr_sendarray;        // pointer to control data
    bool connected;
    char recvarray;
    char* ptr_recvarray;
public:
    WifiConnection();
    void logWSAError(ostream& os, const string& msg);
    bool Initialize();
    bool Connect();
    bool SendData();
    void Quit();
    bool ReceiveHit();
};

#endif
/** Game Header File
    Contains the information about the game class, and also funnels all the header libraries into one place
    Also contains a WifiConnection object that handles all the networking capabilities
*/
#ifndef GAME_H_INCLUDED
#define GAME_H_INCLUDED

#include "header.h"
#include "WifiConnection.h"

const int SCREEN_WIDTH = 400;   // width of the game window
const int SCREEN_HEIGHT = 100;  // height of the game window
const SDL_Rect CONRECT = {10, 50, 97, 24};
const SDL_Rect DISRECT = {10, 50, 121, 24};
const SDL_Rect HITRECT = {10, 10, 44, 24};

class game
{
private:
    bool Running;                   // defines if the game is running
    SDL_Window* window;             // pointer for the game window
    SDL_Renderer* renderer;         // pointer for the window renderer
    SDL_Surface* loadSurface;
    SDL_Texture* TextConnect;
    SDL_Texture* TextDisconnect;
    SDL_Texture* TextHit;
    SDL_Event Event;                // variable to hold event data
    SDL_GameController* controller; // pointer for the game controller
    const Uint8* keystate;          // pointer (used as an array) for the key state information
    Uint8 speed, max_speed;         // initial speed
    Uint8 left_bias, right_bias;
    Sint16 AxisState_Left, AxisState_Right, AxisState_Trigger; // state of controller axes
    int time_fire, time_now, time_sent, time_hit;              // time of last firing and current time
    WifiConnection Connection;      // WifiConnection object
    int deadzone;                   // deadzone for gamepads
    bool firing;
    bool reloading;
    bool hit;
    int time_nowms, time_firems;
public:
    game();
    void logSDLError(ostream& os, const string& msg);
    int OnExecute();
    bool OnInit();
    bool LoadContent();
    void GetKeyboardInput();
    void GetControllerInput();
    void OnEvent(SDL_Event* Event);
    void OnLoop();
    void OnRender();
    void Cleanup();
};

#endif // GAME_H_INCLUDED
/** Header File
    Contains all the libraries (and perhaps later, function prototypes, but we should try to keep those in
    a relevant class structure) that the program needs to run. In order to compile and run this program,
    SDL 2.0 and Winsock2 are also required, and their respective libraries must be linked in your IDE.
    SDL also has a .dll file that must be in the project folder for the executable to run.
*/
#ifndef HEADER_H_INCLUDED
#define HEADER_H_INCLUDED

#include <iostream>
#include <SDL.h>
#include <winsock2.h>
#include <stdint.h>
#include <ctime>

using namespace std;

#endif // HEADER_H_INCLUDED
/** Game Constructor and Execution Loop
Contains both the constructor for game, which initializes the required variables, and the simple loop that controls the order in which things occur and update during the game.
UPDATE: Eliminated delay. Should be balanced by the fact that the code only sends a new array when it changes, but we'll have to keep an eye on this.
*/
#include "Game.h"
game::game() // game constructor
{
    window = NULL;
    renderer = NULL;
    loadSurface = NULL;
    TextConnect = NULL;
    TextDisconnect = NULL;
    TextHit = NULL;
    controller = NULL;
    keystate = SDL_GetKeyboardState(NULL); // initializes a keyboard state that can be updated later
    speed = 100;
    max_speed = 150; // initial speed for keyboard input
    left_bias = right_bias = 0;
    Running = true; // the game is now running
    firing = false;
    reloading = false;
    hit = false;
    time_fire = 0;
    time_sent = 0;
    time_hit = 0;
    deadzone = 6000;
}
int game::OnExecute()
{
    if (OnInit() == false) // initializes everything needed to run the game
    {
        return 1; // if initialization failed, close the program
    }
    while (Running) // loops as long as game is running
    {
        SDL_PumpEvents();
        if (SDL_GameControllerGetAttached(controller) == SDL_TRUE) // get controller input if controller is attached
        {
            GetControllerInput();
        }
        else
        {
            controller = NULL;   // otherwise, reset controller pointer (in case it's been disconnected)
            GetKeyboardInput();  // and get keyboard input
        }
        while (SDL_PollEvent(&Event)) // check the event log for any desired events
        {
            OnEvent(&Event);
        }
        OnLoop(); // sends data
        OnRender();
    }
    Cleanup(); // once game stops running, clean up everything
    return 0;
}
/** Game Cleanup
Before the game quits, this function cleans up all the data and processes that were initialized at the start, including the networking routines
*/
#include "Game.h"
void game::Cleanup()
{
    Connection.Quit();                  // quit the networking routines
    SDL_GameControllerClose(controller); // close the game controller
    SDL_DestroyRenderer(renderer);      // close the renderer
    SDL_DestroyWindow(window);          // close the game window
    SDL_Quit();                         // quit SDL
}
/** Game Controller Input
Checks the state of all desired controller axes and sets data values that correspond to the current state
In the future, controller buttons can also be mapped for certain purposes
*/
#include "Game.h"
void game::GetControllerInput()
{
    Connection.sendarray[0] = 0;
    Connection.sendarray[1] = 0;
    Connection.sendarray[2] = 0;
    Connection.sendarray[3] = 0;
    AxisState_Right = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_RIGHTX);
    AxisState_Left = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_LEFTY);
    if (AxisState_Right > Sint16(deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (AxisState_Right < Sint16(-deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }
    // the left joystick y-axis is positive down, negative up
    if (AxisState_Left > Sint16(deadzone)) // it controls the left motor
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (AxisState_Left < Sint16(-deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }
    if (AxisState_Left < Sint16(-deadzone) && AxisState_Right > Sint16(deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = (6.5/10.0)*speed;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (AxisState_Left < Sint16(-deadzone) && AxisState_Right < Sint16(-deadzone))
    {
        Connection.sendarray[0] = (6.5/10.0)*(speed + (30/130.0)*speed);
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }
    if (AxisState_Left > Sint16(deadzone) && AxisState_Right > Sint16(deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = (6.5/10.0)*speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (AxisState_Left > Sint16(deadzone) && AxisState_Right < Sint16(-deadzone))
    {
        Connection.sendarray[0] = (6.5/10.0)*(speed + (30/130.0)*speed);
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }
    AxisState_Trigger = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_TRIGGERRIGHT);
    // the right trigger controls the firing mechanism
    if (AxisState_Trigger > Sint16(deadzone))
    {
        Connection.sendarray[4] = Uint8(1);
    }
    else
    {
        Connection.sendarray[4] = Uint8(0);
    }
    for (int i = 0; i < 5; i++) // prints out control data for debugging
    {
        cout << int(Connection.sendarray[i]) << endl;
    }
}
/** Game Keyboard Input
Checks the state of all desired keys and sets data values that correspond to keys pressed
*/
#include "Game.h"
void game::GetKeyboardInput()
{
    if (keystate[SDL_SCANCODE_UP]) // pressing the up arrow key increases the speed, up to max_speed
    {
        if (speed < Uint8(max_speed))
        {
            speed++;
        }
    }
    else if (keystate[SDL_SCANCODE_DOWN]) // pressing the down arrow key decreases the speed, down to a min of 0
    {
        if (speed > Uint8(0))
        {
            speed--;
        }
    }
    else {}
    // of the WASD keys, when more than one is pressed, only one will determine the control data
    // it may be helpful in the future to define scenarios in which multiple keys are pressed
    if (keystate[SDL_SCANCODE_W]) // W moves forward and has highest priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (keystate[SDL_SCANCODE_S]) // S moves backward and has second priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (keystate[SDL_SCANCODE_A]) // A turns left and has third priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (keystate[SDL_SCANCODE_D]) // D turns right and has fourth priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(1);
    }
    else // if none are pressed, zero-value movement commands are sent
    {
        Connection.sendarray[0] = Uint8(0);
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = Uint8(0);
        Connection.sendarray[3] = Uint8(0);
    }
    if (keystate[SDL_SCANCODE_SPACE]) // the spacebar fires the IR gun
    {
        Connection.sendarray[4] = Uint8(1);
    }
    else // if not pressed, gun will not fire
    {
        Connection.sendarray[4] = Uint8(0);
    }
    for (int i = 0; i < 5; i++) // prints out control data for debugging
    {
        cout << int(Connection.sendarray[i]) << endl;
    }
}
/** Game LogSDLError
Log an SDL error with some error message to the output stream of our choice
@param os The output stream to write the message to
@param msg The error message to write; format will be "<msg> error: <SDL_GetError()>"
*/
#include "Game.h"
void game::logSDLError(ostream& os, const string& msg)
{
    os << msg << " error: " << SDL_GetError() << endl;
}
/** Game Event
This function receives an event off the top of the event log and checks if it is one that we are interested in
It also handles the connection of a USB game controller
*/
#include "Game.h"
void game::OnEvent(SDL_Event* Event)
{
    if (Event->type == SDL_QUIT) // if the game window is closed, the game will stop running
    {
        Running = false;
    }
    if (Event->type == SDL_CONTROLLERDEVICEADDED) // if a controller is connected, it will be mapped and used in later iterations
    {
        cout << "Controller connected.\n";
        for (int i = 0; i < SDL_NumJoysticks(); i++)
        {
            if (SDL_IsGameController(i))
            {
                controller = SDL_GameControllerOpen(i);
                if (controller)
                {
                    cout << "Controller Name: " << SDL_GameControllerName(controller) << endl;
                    break;
                }
                else
                {
                    logSDLError(cout, "SDL_GameControllerOpen");
                }
            }
        }
    }
    if (Event->type == SDL_CONTROLLERBUTTONDOWN)
    {
        if (Event->cbutton.button == SDL_CONTROLLER_BUTTON_LEFTSHOULDER)
        {
            if (max_speed >= 5)
            {
                max_speed = max_speed - 5;
                if (speed > max_speed)
                {
                    speed = max_speed;
                }
            }
            cout << "LEFT SHOULDER\n";
        }
        if (Event->cbutton.button == SDL_CONTROLLER_BUTTON_RIGHTSHOULDER)
        {
            max_speed = max_speed + 5;
            speed = max_speed; // change this later
            cout << "RIGHT SHOULDER\n";
        }
    }
}
/** Game Initialization
Initializes all the required processes to run the game, including creating a window and renderer, and establishing a network connection
*/
#include "Game.h"
bool game::OnInit()
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0) // initialize SDL
    {
        logSDLError(cout, "SDL_Init");
        return false;
    }
    if ((window = SDL_CreateWindow("SDLRenderClear", 100, 100, SCREEN_WIDTH, SCREEN_HEIGHT,
        SDL_WINDOW_SHOWN)) == NULL) // create a window near the top left of the screen
    {
        logSDLError(cout, "SDL_CreateWindow");
        return false;
    }
    renderer = SDL_CreateRenderer(window, 0, SDL_RENDERER_ACCELERATED |
        SDL_RENDERER_PRESENTVSYNC); // create a renderer for that window so that images can be displayed later
    if (renderer == NULL)
    {
        logSDLError(cout, "SDL_CreateRenderer");
        return false;
    }
    if (Connection.Initialize() == false) // initialize networking functions
    {
        return false;
    }
    if (Connection.Connect() == false) // connect to network
    {
        return false;
    }
    for (int i = 0; i < SDL_NumJoysticks(); i++)
    {
        if (SDL_IsGameController(i))
        {
            controller = SDL_GameControllerOpen(i);
            if (controller)
            {
                cout << "Controller Name: " << SDL_GameControllerName(controller) << endl;
                break;
            }
            else
            {
                logSDLError(cout, "SDL_GameControllerOpen");
            }
        }
    }
    loadSurface = SDL_LoadBMP("Connected.bmp");
    if (loadSurface == NULL) {
        logSDLError(cout, "SDL_LoadBMP_Connect");
        return false;
    }
    TextConnect = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextConnect == NULL) {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Connect");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;
    loadSurface = SDL_LoadBMP("Disconnected.bmp");
    if (loadSurface == NULL) {
        logSDLError(cout, "SDL_LoadBMP_Disconnect");
        return false;
    }
    TextDisconnect = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextDisconnect == NULL) {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Disconnect");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;
    loadSurface = SDL_LoadBMP("Hit.bmp");
    if (loadSurface == NULL) {
        logSDLError(cout, "SDL_LoadBMP_Hit");
        return false;
    }
    TextHit = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextHit == NULL) {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Hit");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;
    return true;
}
/** Game Loop
This function contains any processes that need to be updated each iteration, including checking the time since last fire, and sending the control data
*/
#include "Game.h"
void game::OnLoop()
{
    time_now = SDL_GetTicks(); // takes the current time in ms
    if (Connection.sendarray[4] == 1 && !reloading) // fire only if more than two seconds have passed since last firing
    {
        cout << "FIRING" << endl;
        firing = true;
        reloading = true;
        time_fire = SDL_GetTicks();
    }
    if (time_now - time_fire > 500) {
        firing = false;
    }
    if (time_now - time_fire > 2000) {
        reloading = false;
    }
    if (firing) {
        Connection.sendarray[4] = 1;
    }
    if (!firing)
    {
        Connection.sendarray[4] = 0;
    }
    if ((time_now - time_sent) > 100) {
        if (Connection.SendData() == false) // sends control data
        {
            Connection.connected = false;
            cout << '\a';
            closesocket(Connection.s);
            Connection.s = socket(AF_INET, SOCK_STREAM, 0);
            Connection.Connect();
        }
        time_sent = SDL_GetTicks();
    }
    if (Connection.ReceiveHit()) {
        if ((time_now - time_hit) > 1000) {
            hit = true;
            time_hit = SDL_GetTicks();
            cout << '\a';
        }
    }
    if ((time_now - time_hit) > 1000) {
        hit = false;
    }
}
/** Game Render
This function controls what, if anything, is rendered in the game window
It is used to inform the user of being hit, as well as displaying the reload time
*/
#include "Game.h"
void game::OnRender()
{
    if (reloading) {
        // Draws and redraws a growing rectangle between 0 and 2000 ms of firing.
        SDL_Rect r;
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        r.x = 75;
        r.y = 10;
        r.w = (time_now - time_fire) / 10.0;
        r.h = 25;
        SDL_SetRenderDrawColor(renderer, 255, 0, 0, 255);
        SDL_RenderDrawRect(renderer, &r);
    }
    if (!reloading) {
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        if (time_now - time_fire > 2200) {
            SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
            SDL_RenderClear(renderer);
            if (Connection.connected) {
                SDL_RenderCopy(renderer, TextConnect, NULL, &CONRECT);
            }
            if (!Connection.connected) {
                SDL_RenderCopy(renderer, TextDisconnect, NULL, &DISRECT);
            }
            if (hit) {
                SDL_RenderCopy(renderer, TextHit, NULL, &HITRECT);
            }
        }
    }
    SDL_RenderPresent(renderer);
}
/** Main Routine
Creates a game object called theGame, which also runs its constructor, and then starts the game.
*/
#include "Game.h"
int main(int argc, char* argv[]) // SDL requires main() to take these parameters
{
    game theGame;
    return theGame.OnExecute();
}
/** WifiConnection Constructor
Initializes necessary data for the WifiConnection object
*/
#include "Game.h"
#include <string.h>
WifiConnection::WifiConnection()
{
    string ip_addr;
    cout << "Input IP address: ";
    cin >> ip_addr;
    PORT = 5001;
    IP_ADDRESS = ip_addr.c_str();
    ptr_sendarray = (char*)(&sendarray);
    ptr_recvarray = (char*)(&recvarray);
    arraysent[0] = 0;
    arraysent[1] = 0;
    arraysent[2] = 0;
    arraysent[3] = 0;
    arraysent[4] = 0;
    connected = false;
    server.sin_addr.s_addr = inet_addr(IP_ADDRESS);
    server.sin_family = AF_INET;
    server.sin_port = htons(PORT);
}
/** WifiConnection Connect
Connects to the network IP and port defined in the constructor
*/
#include "Game.h"
bool WifiConnection::Connect()
{
    if (connect(s, (struct sockaddr*)&server, sizeof(server)) < 0)
    {
        logWSAError(cout, "connect");
        return false;
    }
    u_long iMode = 1;
    ioctlsocket(s, FIONBIO, &iMode); // sets socket to non-blocking mode
    setsockopt(s, IPPROTO_TCP, TCP_NODELAY, ptr_sendarray, 1); // deactivates Nagle algorithm
    cout << "Connected\n";
    connected = true;
    return true;
}
/** WifiConnection Initialize
Initializes all the WinSock functions and processes necessary for networking, and creates a socket
*/
#include "Game.h"
bool WifiConnection::Initialize()
{
    cout << "Initializing Winsock Version 2...";
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
    {
        logWSAError(cout, "WSAStartup");
        return false;
    }
    cout << "Initialized.\n";
    if ((s = socket(AF_INET, SOCK_STREAM, 0)) == INVALID_SOCKET)
    {
        logWSAError(cout, "socket");
        return false;
    }
    cout << "Socket created.\n";
    return true;
}
/** WifiConnection LogWSAError
Log a WSA error with some error message to the output stream of our choice
@param os The output stream to write the message to
@param msg The error message to write; format will be "<msg> error: <WSAGetLastError()>"
*/
#include "Game.h"
void WifiConnection::logWSAError(ostream& os, const string& msg)
{
    os << msg << " error: " << WSAGetLastError() << endl;
}
/** WifiConnection Quit
Ends the connection, closes the socket, and quits all the Winsock processes
*/
#include "Game.h"
void WifiConnection::Quit()
{
    ptr_sendarray = NULL;
    closesocket(s);
    WSACleanup();
}
/** WifiConnection SendData
Sends the control data over the created network connection
*/
#include "Game.h"
bool WifiConnection::SendData()
{
    if (send(s, ptr_sendarray, 5, 0) < 0)
    {
        logWSAError(cout, "send");
        return false;
    }
    arraysent[0] = sendarray[0];
    arraysent[1] = sendarray[1];
    arraysent[2] = sendarray[2];
    arraysent[3] = sendarray[3];
    arraysent[4] = sendarray[4];
    return true;
}