
Introduction

This project aims to develop a desktop application for the Training and Placement Department of the college. The system is a desktop-based application that can be accessed throughout the organization, and from outside as well, with a proper login.
This system can be used as:
1) An application for the TPO of the college to manage student information with regard to placement.
2) An application that organizations/companies can use to manage the interview process.
3) An application for any job seeker who would like to keep a record of the companies he/she has applied to.

Project Specifications

Name: Placement Management System

Project description: Desktop application

Project expedient:
• Desktop-based application that can be accessed throughout the organization
• Manages student information with regard to placement
• Can be used by organizations/companies for managing the interview process

Project objective: To provide "placement solutions" to the user as well as to organizations

Project type: Java, Applets, Swing

Team size: 1

Learning required:
• J2SE
• MS Access
• SQL
• GUI

Scope
The project has a wide scope and can be implemented in diversified areas. We can store the information of all the employees; various companies can access this information, and employees can maintain their information and update it.
FEASIBILITY STUDY
RECOGNITION OF NEED:-
For the purpose of training and placement of students in colleges, TPOs/organizations have to collect the information of students, manage it manually, and arrange it according to the various streams.
Any required modification also has to be made manually. So, to reduce the work required to manage the information of the various recruiters, a new system is proposed which is processed through computers.
PRESENT WORKING SYSTEM:-
• In various colleges, training and placement officers have to manage the CVs and documents of students for their training and placement manually.
• TPOs have to collect the information of the various companies that want to recruit students and notify students about them from time to time.
• TPOs have to arrange students' CVs according to the various streams and notify students according to company requirements.
• If any modification or update is required in the CV of any student, it has to be searched for and edited manually.
BOTTLENECKS OF PRESENT WORKING SYSTEM:-
• The job of the TPO is a unique task that involves taking many features into consideration. The existing system has some bottlenecks, as seen by TPOs and students of colleges.
• The collection of CVs may be very large, and handling such a large collection is a great overhead.
• It is a very burdensome task to arrange CVs according to the various streams and match them with the companies' requirements.

PROPOSED SOLUTIONS:-
To develop a system that would accomplish the following:
• Reduce the paperwork and storage area.
• Improve the output of operators.
• Improve the accuracy of results.
• Manage the man and machine resources efficiently.
• Provide a user-friendly interface with quick, authenticated access to documents.
• Be easily scalable to grow with changing system requirements.
• Provide secured check-in, check-out and updates.
Low Level Design
After finishing the database design, the next step was the architectural design. The method
chosen for low-level detail was to draw the DFD (Data Flow Diagram). A data-flow diagram
(DFD) is a graphical representation of the "flow" of data through an information system. DFDs
can also be used for the visualization of data processing (structured design). On a DFD, data
items flow from an external data source or an internal data store to an internal data store or an
external data sink, via an internal process. Data flow diagrams can be used to provide a clear
representation of any business function. The technique starts with an overall picture of the
business and continues by analyzing each of the functional areas of interest. This analysis can be
carried out to precisely the level of detail required. The technique exploits a method called top-down expansion to conduct the analysis in a targeted way. Typically, data-flow diagrams are used to describe how the system transforms information: they define how information is processed and stored and identify how the information flows through the processes.
A DFD provides no information about the timing of processes, or about whether processes will
operate in sequence or in parallel. It is therefore quite different from a flowchart, which shows
the flow of control through an algorithm, allowing a reader to determine what operations will be
performed, in what order, and under what circumstances, but not what kinds of data will be input
to and output from the system, nor where the data will come from and go to, nor where the data
will be stored (all of which are shown on a DFD).
As each user is isolated from other users, he/she can only log in to or log out of the system at will. Once logged in, the user can stay in or log out of the system. There are three different types of users, so we have three logged-in states. If the login fails, a user stays in the not-logged-in state until he/she passes account verification.
Note: 1. If the user name and password do not match, stay in the not-logged-in state.
2. If the user name and password match for an employer, go to the logged-in-as-EMPLOYER state.
3. If the user name and password match for an employee, go to the logged-in-as-EMPLOYEE state.
4. If the user name and password match for a developer, go to the logged-in-as-DEVELOPER state.
5. Logged out from the Employee state.
6. Logged out from the Employer state.
7. Logged out from the Developer state.
Fig 22 DFD 0 Level
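The login/logout transitions described in the notes above can be sketched as a small Java state machine. This is an illustrative sketch only: the class name, the in-memory credential map, and the sample accounts are stand-ins for the project's actual database-backed verification.

```java
import java.util.Map;

// Minimal sketch of the login state machine described in the notes.
// The hard-coded map is a stand-in for the real database lookup.
class LoginStateMachine {
    enum State { NOT_LOGGED_IN, EMPLOYER, EMPLOYEE, DEVELOPER }

    private State state = State.NOT_LOGGED_IN;

    // Stand-in credential store: user -> {password, role}.
    private final Map<String, String[]> accounts = Map.of(
            "acme",  new String[]{"secret1", "EMPLOYER"},
            "ravi",  new String[]{"secret2", "EMPLOYEE"},
            "admin", new String[]{"secret3", "DEVELOPER"});

    State getState() { return state; }

    // Note 1: on a mismatch we stay in NOT_LOGGED_IN.
    // Notes 2-4: on a match we move to the logged-in state for the user's role.
    boolean logIn(String user, String password) {
        String[] record = accounts.get(user);
        if (record == null || !record[0].equals(password)) {
            return false;                     // stay in NOT_LOGGED_IN
        }
        state = State.valueOf(record[1]);     // EMPLOYER / EMPLOYEE / DEVELOPER
        return true;
    }

    // Notes 5-7: logging out from any logged-in state returns to NOT_LOGGED_IN.
    void logOut() { state = State.NOT_LOGGED_IN; }
}
```

In the real system, logIn would query the user table instead of the hard-coded map.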

The level 1 diagram identifies the major business processes at a high level, and any of these processes can then be analyzed further, giving rise to a corresponding level 2 business process diagram. This process of more detailed analysis can then continue through levels 3, 4 and so on. However, most investigations stop at level 2, and it is very unusual to go beyond level 3 diagrams.
A level 1 diagram follows; it provides an overview of the major functional areas of the business.

Testing

7.1. Introduction
Software testing is any activity aimed at evaluating an attribute or capability of a program or
system and determining that it meets its required results. Although crucial to software quality
and widely deployed by programmers and testers, software testing still remains an art, due to limited understanding of the principles of software. The difficulty in software testing stems from the complexity of software: even a program of moderate complexity cannot be tested completely.
Testing is more than just debugging. The purpose of testing can be quality assurance, verification
and validation, or reliability estimation. Testing can be used as a generic metric as well.
Correctness testing and reliability testing are two major areas of testing. Software testing is a
trade-off between budget, time and quality.
It is an investigation conducted to provide stakeholders with information about the quality of the
product or service under test. Software testing also provides an objective, independent view of
the software to allow the business to appreciate and understand the risks at implementation of the
software. Test techniques include, but are not limited to, the process of executing a program or
application with the intent of finding software bugs.
Software testing can also be stated as the process of validating and verifying that a software
program/application/product:
1. Meets the business and technical requirements that guided its design and development;
2. Works as expected; and
3. Can be implemented with the same characteristics.
Software testing, depending on the testing method employed, can be implemented at any time in
the development process. However, most of the test effort occurs after the requirements have
been defined and the coding process has been completed. As such, the methodology of the test is
governed by the software development methodology adopted.
Different software development models focus the test effort at different points in the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers.
Testing cannot establish that a product functions properly under all conditions but can only
establish that it does not function properly under specific conditions. The scope of software
testing often includes examination of code as well as execution of that code in various
environments and conditions as well as examining the aspects of code: does it do what it is
supposed to do and do what it needs to do. In the current culture of software development, a
testing organization may be separate from the development team. There are various roles for
testing team members. Information derived from software testing may be used to correct the
process by which software is developed.

7.2. Defects and failures


Not all software defects are caused by coding errors. One common source of expensive defects is
caused by requirement gaps, e.g., unrecognized requirements that result in errors of omission by
the program designer. A common source of requirements gaps is non-functional requirements
such as testability, scalability, maintainability, performance and security.
Software faults occur through the following processes. A programmer makes an error (mistake)
which results in a defect (fault, bug) in the software source code. If this defect is executed, in
certain situations the system will produce wrong results, causing a failure. Not all defects will
necessarily result in failures. For example, defects in dead code will never result in failures. A
defect can turn into a failure when the environment is changed. Examples of these changes in
environment include the software being run on a new hardware platform, alterations in source
code or interacting with different software. A single defect may result in a wide range of failure
symptoms.
7.3. Compatibility
A common cause of software failure (real or perceived) is a lack of compatibility with other application software, operating systems (or operating system versions, old or new), or target environments that differ greatly from the original (such as a terminal or GUI application intended to be run on the desktop now being required to become a web application, which must render in a web browser). For example, in the case of a lack of backward compatibility, this can occur because the programmers develop and test software only on the latest version of the target environment, which not all users may be running. This results in the unintended consequence that the latest work may not function on earlier versions of the target environment, or on older hardware that earlier versions of the target environment were capable of using. Sometimes such issues can be fixed by proactively abstracting operating system functionality into a separate program module or library.

7.4. Software verification and validation



Software testing is used in association with verification and validation:


i. Verification: Have we built the software right? (i.e., does it match the specification?)
ii. Validation: Have we built the right software? (i.e., is this what the customer wants?)

The terms verification and validation are commonly used interchangeably in the industry; it is
also common to see these two terms incorrectly defined. According to the IEEE Standard
Glossary of Software Engineering Terminology:
Verification is the process of evaluating a system or component to determine whether the
products of a given development phase satisfy the conditions imposed at the start of that phase.
Validation is the process of evaluating a system or component during or at the end of the
development process to determine whether it satisfies specified requirements.

7.4.1 The software testing team


Software testing can be done by software testers. Until the 1980s the term "software tester" was
used generally, but later it was also seen as a separate profession. Regarding the periods and the
different goals in software testing, different roles have been established: manager, test lead, test
designer, tester, automation developer, and test administrator.

7.4.2 Input combinations and preconditions


A very fundamental problem with software testing is that testing under all combinations of
inputs and preconditions (initial state) is not feasible, even with a simple product. This means
that the number of defects in a software product can be very large and defects that occur
infrequently are difficult to find in testing. More significantly, non-functional dimensions of quality (how it is supposed to be versus what it is supposed to do), such as scalability, performance and reliability, can be highly subjective; something that constitutes sufficient value to one person may be intolerable to another.
Software bugs will almost always exist in any software module with moderate size: not because
programmers are careless or irresponsible, but because the complexity of software is generally
intractable -- and humans have only limited ability to manage complexity. It is also true that for
any complex systems, design defects can never be completely ruled out.
Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. For the best testing results, a third group should be responsible for testing. As the project is still in the development phase, the development team took care of the testing as well.

7.5. Development Process Model


The testing process is always bundled with the development process. For this application, the prototype model was used during development.
It begins with requirements gathering: a prior meeting was held to gather information about the overall requirements. Thereafter, the process of making a "quick design" was started. The quick design focuses on a representation of those aspects of the software that will be visible to the user, and it leads to the construction of a prototype. Ideally, the prototype serves as a mechanism for identifying software requirements. At the beginning, the product's behavior and exact appearance were not known; after building several prototypes, it was finally decided what kind of interface was required. After each version was built, a round of initial testing began, which gave the team further suggestions and requirements for improving the application.

7.5.1 Testing Plan


White-box testing and black-box testing techniques were used for software validation and verification. White-box testing is predicated on close examination of procedural detail, while black-box testing is used to demonstrate that software functions are operational. For the white-box testing, the team followed basis-path testing: from the flow chart, all the independent paths were identified.
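Basis-path testing sizes the set of independent paths with the cyclomatic complexity of the flow graph: V(G) = E - N + 2 for a connected graph with E edges and N nodes. The following is a generic worked example, not the project's actual flow-chart numbers:

```java
// Worked example of the basis-path count: cyclomatic complexity
// V(G) = E - N + 2 for a connected flow graph.
class BasisPath {
    static int cyclomaticComplexity(int edges, int nodes) {
        return edges - nodes + 2;
    }

    public static void main(String[] args) {
        // e.g. a flow graph with 9 edges and 7 nodes needs 4 independent paths
        System.out.println(cyclomaticComplexity(9, 7)); // prints 4
    }
}
```

Each of the V(G) independent paths then gets at least one test case in the basis set.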

Fig 28 Unit Testing

Notes:
1: The user logs in.
2: If the password and user name match,
   then 3: get the user's status from the database;
   else 4: display an error message and go back to the login page.
End if.
5: Case on the user's status (selection):
   Selection 6: display the corresponding interface.
   Selection 7: display the corresponding interface.
   Selection 8: display the corresponding interface.
End case.
8e: Checking jobs.
9: Log out.
10: Kill the session cookies and redirect to the initial login page.

Not all of the testing work is done at the last minute; some of it runs in parallel with the coding process. Each function is tested separately. Each function in the software is like a module, the smallest unit of the software. Basically, all the functions that the user can use are linked to the interface: once a link is clicked, the corresponding function is called. The interface acts as the driver for the unit testing, and no stub is needed in this case.
Because the units in this project are not based on each other, no integration testing was applied.
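This per-function approach can be sketched as a plain driver exercising one unit directly. validateCredentials is a hypothetical stand-in for one of the project's module functions, not taken from its source:

```java
// Sketch of unit testing a single module function with a plain driver,
// mirroring the approach above: no stubs, the driver calls the unit
// directly just as the interface would.
class UnitTestDriver {
    // Hypothetical module under test: the smallest unit, one function.
    static boolean validateCredentials(String user, String password) {
        return user != null && !user.isEmpty()
                && password != null && password.length() >= 6;
    }

    public static void main(String[] args) {
        assert validateCredentials("ravi", "secret1") : "valid input rejected";
        assert !validateCredentials("", "secret1")    : "empty user accepted";
        assert !validateCredentials("ravi", "abc")    : "short password accepted";
        System.out.println("all unit checks passed");
    }
}
```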
Finally, alpha testing was used: usage problems and errors were recorded, the errors were corrected and the usage problems solved, and the same process was repeated again and again until the final version.
The process can be described by using the figure below:
Fig 29 Developing and Testing Process

The security issue is mainly focused on database access privileges. To access the MySQL server, user-account and password verification must be passed. The development team uses a single function for connecting to the database, which by convention is far more secure. From the user interface, once a user passes the password and account verification, he/she can only access the database according to his or her privileges. A user may or may not log out, as this does not affect the state of the database. When the user closes the browser or navigates away, the log-out function is called automatically, killing all of the user's cookies for his/her session.

Stress testing was done on the application to check its limits. Stress tests are designed to confront programs with abnormal situations. When the application was tested, it was checked whether a user could send multiple friend requests, set more than one mood, and send compliments to one or more friends. By doing that, we were able to exercise all the system functions, although in the real world this is not the typical usage.
When testing the robustness of the product, the team tried to violate the size-limit definitions of the attributes in the tables of the project-management database. For example, the ename attribute of the employee table is defined as varchar(25), and input of length 30 was provided to ename, but no error message was generated: MySQL just truncated it to fit. This may cause trouble, because the user does not know the length limit and will think the input has been accepted; but when he/she later uses the ename to log in, the system will not allow him/her to log in. The solution is to limit the input size from the interface. For instance, if the fname attribute of the user table is defined as varchar(20), then the size of the input text field in the interface is limited to 20, so it no longer causes trouble. For all the text areas, the definition in the MySQL database is LONGTEXT, which can hold up to 4,294,967,295 bytes (about 4 GB); that should be enough even for a project final report.
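The interface-side fix described above can be sketched in Swing with a DocumentFilter that caps a field at the column's declared size (here 25, matching the ename varchar(25) example; the class name is illustrative):

```java
import javax.swing.text.*;

// Sketch of the interface-side fix: cap the text field's length at the
// column's declared size so MySQL never silently truncates the value.
class LengthLimitedField {
    // Returns a document that refuses edits beyond maxLength characters.
    // Attach it to a field with: new JTextField(limitedDocument(25), "", 25)
    static PlainDocument limitedDocument(int maxLength) {
        PlainDocument doc = new PlainDocument();
        doc.setDocumentFilter(new DocumentFilter() {
            @Override
            public void insertString(FilterBypass fb, int offset,
                                     String text, AttributeSet attrs)
                    throws BadLocationException {
                replace(fb, offset, 0, text, attrs);  // route through replace
            }

            @Override
            public void replace(FilterBypass fb, int offset, int length,
                                String text, AttributeSet attrs)
                    throws BadLocationException {
                int newLength = fb.getDocument().getLength() - length
                        + (text == null ? 0 : text.length());
                if (newLength <= maxLength) {
                    super.replace(fb, offset, length, text, attrs);
                }
                // else: drop the edit, so the value always fits varchar(25)
            }
        });
        return doc;
    }
}
```

With this in place, an over-long ename can never be submitted, so the silent-truncation login problem described above cannot occur.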
