
PROJECT REPORT

ON
Production Management System For IFCAL

SUBMITTED IN PARTIAL FULFILLMENT FOR THE AWARD OF THE


DEGREE
OF
MASTER IN COMPUTER APPLICATION
(BATCH 2009-2012)

BY
ARUN KUMAR PATTANAYAK
Regd. No: 0931020075

UNDER THE ESTEEMED GUIDANCE OF

Ms. Preeti Routray

Department of Computer Applications


Institute of Technical Education and Research

Declaration

I hereby declare that the Project entitled “Production Management System For IFCAL”
submitted to the Department of Computer Applications, Institute of Technical Education &
Research, Siksha O Anusandhan University, Bhubaneswar, Odisha, in partial fulfillment for the
award of the degree of Master in Computer Application in the session 2009-2012, is an authentic
record of my own work carried out under the guidance of Ms. Preeti Routray and Mr.
Satyabrata Mishra, and that the Project has not previously formed the basis for the award of
any other degree. The report has been prepared in compliance with the guidelines specified by
the University.

Place: Bhubaneswar
Date: 04/05/2012

Signature
ARUN KUMAR PATTANAYAK
0931020075

This is to certify that the above statement made by the candidate is correct to the best of my
knowledge.

Ms.Preeti Routray
Lecturer
Department Of Computer Applications

Acknowledgement

I am very grateful to my project guide Ms. Preeti Routray for giving her valuable time and
constructive guidance in preparing the Synopsis/Project. It would not have been possible
to complete this project in such a short period of time without her kind encouragement and
valuable guidance.

Arun Kumar Pattanayak


04/05/2012

CERTIFICATE

This is to certify that this project entitled “Production Management System For IFCAL”,
submitted in partial fulfillment of the degree of MASTER IN COMPUTER APPLICATION to
the Department of Computer Applications, Institute of Technical Education and Research,
under Siksha O Anusandhan University, Bhubaneswar, Odisha, done by Mr Arun Kumar
Pattanayak, Registration No. 0931020075, is an authentic work carried out by him at IDCOL
Software Limited under our guidance and is worthy of acceptance for the award of the degree.
The work fulfills all the requirements as per the regulations of the University and, in our
opinion, has reached the standard needed for submission.

Signature of the Internal Guide


Ms. Preeti Routray
Lecturer
Department of Computer Applications

EXAMINER

Dr. Debahuti Mishra


HEAD OF DEPARTMENT
Computer Applications
Institute of Technical Education and Research

CONTENTS

1. OBJECTIVE AND SCOPE                                                    Page Number

1.1 Synopsis
2. INTRODUCTION
2.1 Organization Profile…………………………………………………………. 10
2.2 Project Introduction…………………………………………………………. 12
3. SYSTEM ANALYSIS
3.1 Existing System…………………………………………………………….. 13
3.2 Proposed System……………………………………………………………. 13
3.3 Feasibility Study…………………………………………………………… 15
3.3.1 Technical
3.3.2 Economical
3.3.3 Operational
3.4 Data Collection and Analysis………………………………………………... 18
3.5 Data Analysis and Interpretation……………………………………………. 19
4. SYSTEM SPECIFICATION
4.1 About Software……………………………………………………………... 20
4.2 Software Details……………………………………………………………. 23
4.3 Software Specification ……………………………………………………. 34
4.4 Hardware Specification……………………………………………………. 34
5. SYSTEM PLANNING 35
6. SYSTEM DESIGN
6.1 Project Description………………………………………………………… 36
6.2 Database Design…………………………………………………………… 40
6.3 Database Tables……………………………………………………………. 42
6.4 Dataflow Diagram…………………………………………………………. 48
6.5 Process Flow Diagram……………………………………………………. 56
7. TESTING AND IMPLEMENTATION
7.1 System Testing……………………………………………………………. 66
7.2 System Implementation…………………………………………………… 67
8. SYSTEM MAINTENANCE 68
9. COST AND BENEFIT ANALYSIS 69
10. SCREEN LAYOUTS………………………………………………………………… 72
CONCLUSION………………………………………………………………………….. 80
BIBLIOGRAPHY……………………………………………………………………….. 81

CHAPTER 1

OBJECTIVE & SCOPE OF THE PROJECT

The main aim is to ensure transparency and accuracy and to avoid human error. This
system keeps complete records of the production process and generates reports as per the office
requirements. The system provides several forms in which a user can enter details of the
materials received, produced and dispatched.

The basic objectives of the project are:

• The manual system can be eliminated, so pen-and-paper work is minimized. Fewer staff
  members are required as compared to the manual system.

• A huge amount of data can be stored.

• Speed, economy and efficiency are high enough to make computerization a key factor.

• To provide accuracy and reduce processing time.

• To provide a user-friendly system.

• Processing and generating different types of reports.

• Application development for various state departments and public sector undertakings.

• Software distribution and implementation.

• Avoiding human error.

The scope of the project is:

• This project is basically developed for IFCAL, which is a subsidiary company of IDCOL,
  so it is used by IFCAL and IDCOL. Any other company with the same working procedure
  can also use this software.
• The project can be used in all the departments of the company under the IDCOL
  organization, such as production, maintenance, order, purchase, supply, security, admin etc.
• Different branches of the company can use it.
• The software is used in the head branch.

1.1: SYNOPSIS

The project titled “Production Management System” has been developed in
Microsoft Visual Studio 2008, using ASP.NET 3.5 as the front end and SQL Server 2005 as
the back end.
The project Production Management System has been developed for IDCOL
Ferrochrome And Alloy Ltd. The main aim is to automate the production and dispatch
process. This system keeps complete records of the entry of raw materials, the raw material
inventory, the production process, etc.
This system has several forms in which a user can enter the details about the production
process. Each form has proper validations with proper authentication.
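As a rough illustration of how such a validated, authenticated form might look in ASP.NET 3.5, consider the following code-behind sketch. The page name, control names (txtUserId, txtPassword, lblMessage) and the ValidateUser helper are illustrative assumptions, not names taken from the actual project.

```csharp
// Hypothetical ASP.NET code-behind sketch for an authenticated entry page.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class ProductionEntry : Page
{
    // In a Web Application Project these fields are normally declared
    // in the generated designer file; shown here for completeness.
    protected TextBox txtUserId, txtPassword;
    protected Label lblMessage;

    protected void btnLogin_Click(object sender, EventArgs e)
    {
        // Server-side validation: reject empty credentials before
        // attempting authentication.
        if (txtUserId.Text.Trim().Length == 0 || txtPassword.Text.Length == 0)
        {
            lblMessage.Text = "User ID and password are required.";
            return;
        }

        if (ValidateUser(txtUserId.Text.Trim(), txtPassword.Text))
            Response.Redirect("ProductionHome.aspx");
        else
            lblMessage.Text = "Invalid user ID or password.";
    }

    private bool ValidateUser(string userId, string password)
    {
        // In the real system this would check the credentials against a
        // user table in SQL Server with a parameterized command;
        // omitted here for brevity.
        throw new NotImplementedException();
    }
}
```

The same pattern (validate first, then authenticate, then redirect) would repeat across the other data-entry forms.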

CHAPTER- 2

INTRODUCTION

2.1: ORGANISATION PROFILE:

“THE INDUSTRIAL DEVELOPMENT CORPORATION OF ORISSA


LIMITED (IDCOL)” was incorporated on 29th March, 1962 as a wholly owned Govt. of Orissa
undertaking with an authorized capital of Rs. 50 crores. It has its registered office at IDCOL
House, Unit-II, Ashok Nagar, Bhubaneswar-751001, Khurda, Orissa, India (Phone No. 0674-
2532848, Fax: 0674-2530518, web: WWW.idcoorissa.com). The main objectives of the
Corporation are to establish, promote and execute heavy and medium scale industries within the
state of Orissa and to aid, assist and finance industrial undertakings with capital for the execution
of their works/business, which is likely to promote industrial development and accelerate the
pace of industrial growth of the state. The present authorized share capital of the Corporation is
Rs. 75 crores and the paid-up capital is Rs. 75.12 crores. IDCOL is now undergoing a reform
process, keeping pace with the present market environment and the Government’s economic
policy. Almost all the subsidiary companies have already been disinvested: IDCOL Cement
Limited (ICL), IDCOL Rolling Mill (IRML), Hirakud Industrial Works Limited (HIWL).
IDCOL SOFTWARE LIMITED (ISL)

IDCOL Software Limited (ISL), a subsidiary of IDCOL and a Govt. of Orissa enterprise,
is emerging as a complete IT solution company in the state. It has been set up with the objective
of establishing a strong and commercially viable business in the area of information technology
for the betterment of the people of the State of Orissa, along with the aim of preparing the young
generation to meet future requirements in the IT field. It has been given the status of Strategic
Microsoft Technology Partner (SMTP) and system integrator of Compaq. In association with IT
majors like Microsoft and Compaq, ISL has envisioned carrying out prestigious projects to bring
e-governance to the state.

Some of the major projects are:


• File tracking system at Secretariat
• Integrating Block level activities in the state
• Land record designation projects
• Loan Reconciliation system of OSFC
• Loan accounting system of OSFC
• Financial Accounting System of IDCOL
• Dealer management system for IDCOL Cement Ltd.

Objectives:

• Application Development for various state departments and public sector
  undertakings
• Providing system integration services
• Imparting hi-end software training
• Software distribution and implementation.
Capabilities:

ISL today has the latest computing facilities in its computer center. It has more than 30
Pentium computers connected in an intranet. The internet connectivity in the lab facilitates
instant access to rich information on the net. Highly qualified, research-oriented technical staff
have made ISL an ideal platform today for carrying out industry-standard training and
software development activities at its center. It has the following facilities:

• Development and training server
• Web server

2.2: PROJECT INTRODUCTION
The project “PRODUCTION MANAGEMENT SYSTEM” has been developed for the
IFCAL Production Department. This is a web based project with ASP.NET as the front end
tool and SQL Server 2005 as the back end tool.

“Production Management System” has several forms that are validated properly and
provide the facilities to add a new user, enter and modify the data, etc. The entire project is
report-driven: it generates a report for each form whenever an entry is successfully made.
Production entry is protected by user id and password authentication so that unauthenticated
users cannot access the system. The Production Management System is thus a secure and
performance-oriented project.

CHAPTER-3:

PROBLEM DEFINITION

3.1 EXISTING SYSTEM:

The existing system of the production department is maintained manually. All the
documentation and reporting are done by individual persons, causing wastage of time and man
power. In the organization it is always necessary to consult the previous years’ records for
estimating the budget, and it is difficult for a person to maintain the records, go through the
previous records and do the budget calculations manually. Human error may occur during
manual verification and documentation of records. Above all, production records cannot be
maintained properly in a manual system.

3.2 PROPOSED SYSTEM:

The proposed system, “Production Management System”, is a web supported
computerized system which allows inserting all necessary data into the specified fields.
All data and inserted values are secured according to the production region. Existing data cannot
be altered without reference to the administrator. As it is web based, the system can be accessed
anywhere in the network, but the data are secured because they are inserted through a login
process. Any message can be passed to all the respective offices in India and displayed when the
site is opened. Through the login procedure, the administrator can print the final report, monthly
or annual as required, at any place and at any point of time.

SYSTEM ARCHITECTURE

Client (Browser) → Web Server (IIS) → ActiveX Data Objects (ADO) → Database
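In an ASP.NET 3.5 application the data layer between IIS and SQL Server is typically ADO.NET rather than classic ADO. A minimal sketch of that layer is shown below; the connection string and the RawMaterial table with its columns are assumptions for illustration only, not the project's actual schema.

```csharp
// Illustrative ADO.NET sketch of the data layer between the web server
// and SQL Server 2005. Server, database and table names are hypothetical.
using System.Data;
using System.Data.SqlClient;

class RawMaterialRepository
{
    const string ConnStr =
        "Data Source=SERVER;Initial Catalog=PMS;Integrated Security=True";

    public static DataTable GetRawMaterials()
    {
        using (SqlConnection con = new SqlConnection(ConnStr))
        using (SqlDataAdapter da = new SqlDataAdapter(
            "SELECT MaterialCode, MaterialName, Quantity FROM RawMaterial",
            con))
        {
            // Fill opens the connection, runs the query and loads the rows
            // into an in-memory DataTable, then closes the connection.
            DataTable dt = new DataTable();
            da.Fill(dt);
            return dt;
        }
    }
}
```

The returned DataTable can then be bound directly to a GridView or similar control on the page.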

3.3: FEASIBILITY STUDY:

Feasibility Analysis:

After the problem is clearly understood and solutions are proposed, the next
step is to conduct the feasibility study, which is part of the system analysis as well as the design
process. The main objective of the study is to determine whether the proposed system is feasible
or not. The proposed system is mainly subjected to three types of feasibility, as discussed below:

System analysis is a process by which we attribute purposes or goals to a human


activity, determine how well those purposes are being achieved, and specify the requirements of
the various tools and techniques that are to be used within the system if the system performances
are to be achieved.

System analysis bridges the gap between system level requirements and software design.
System analysis allows us to refine the software allocation and to build models of the data,
functional and behavioral domains that will be treated by the software. The requirement analysis
may be divided into problem recognition, evaluation and synthesis, modeling, specification and
review. System analysis examines the use of data to carry out specific business processes within
the scope of the system investigation.

Analysis means a detailed explanation or description. Before computerizing a system


under consideration, it has to be analyzed. We need to study how it functions currently, what is
the problem and what are the requirements that the proposed system should meet.

The main components of making software are:

1. System and software requirements analysis


2. Design and implementation of software.
3. Ensuring, verifying and maintaining software integrity
3.3.1 Technical Feasibility:
A study of resource availability that may affect the ability to achieve an acceptable
system. Technical feasibility deals with hardware as well as software requirements. If the
necessary requirements are made available with the system, then the proposed system is said
to be technically feasible.

The software required for this system is:
1. Windows XP
2. ASP.Net
3. SQL Server 2005

2. ASP.NET:

Microsoft .NET Framework is a software component that is a part of Microsoft


Windows operating systems. It has a large library of pre-coded solutions to common
programming problems and manages the execution of programs written specifically for the
framework. The .NET Framework is a key Microsoft offering and is intended to be used by most
new applications created for the Windows platform.

3. SQL Server 2005:

Microsoft SQL Server 2005 is a full-featured relational database management system


(RDBMS) that offers a variety of administrative tools to ease the burdens of database
development, maintenance and administration.

Six of the more frequently used tools are: Enterprise Manager, Query Analyzer, SQL Profiler,
Service Manager, Data Transformation Services and Books Online.

3.3.2 Economical Feasibility


Economic justification is generally the “bottom-line” consideration for the system.
Economic justification includes a broad range of IFCAL concerns, including cost-benefit
analysis, even though an initial investment has to be made on the software and hardware aspects.
The proposed system aims at processing transactions effectively, thus saving money,
since the existing system takes a lot of time and money.
The proposed system is therefore economically efficient. Even though the initial investment in
the proposed system is high, this can be overcome by gradual use of the system.

3.3.3 Operational Feasibility
Proposed projects are beneficial only if they can be turned into information systems that
will meet the organization’s operating requirements. Simply stated, this test of feasibility asks
whether the system will work when it is developed and installed, and whether there are major
barriers to implementation. Here are questions that will help test the operational feasibility of the
project:
Is there sufficient support for the project from management and from users? If the current
system is well liked and used to the extent that persons will not be able to see reasons for a
change, there may be resistance.

A. Are current business methods acceptable to the users? If they are not, users may
welcome a change that will bring about a more operational and useful system.

B. Have the users been involved in the planning and development of the project? Early
involvement reduces the chances of resistance to the system and to change in general,
and increases the likelihood of successful projects.
Issues that appear to be relatively minor in the beginning have ways of growing into
major problems after implementation. Therefore all operational aspects must be considered
carefully.
In brief, one of the most important questions to be analyzed is whether the operating
staff, i.e. the staff using the system, are ready for a new change, for a new technology.
Otherwise the whole system ultimately fails. They are the ones who are going to use this
technology. After the operational feasibility analysis, it was found that almost all the staff
members are computer literate. Furthermore, they also understood the benefits of using a
computer system and are interested in having such software installed at their end, which shall
help them perform their duties in a systematic manner. This completes the operational
feasibility study.

3.4 DATA COLLECTION AND ANALYSIS:

The data is collected in different ways. The collection of data is classified into two types:

• Primary Data
• Secondary Data

Primary Data:

• The data is collected from consumers.
• The information is gathered from the IFCAL Department Officers.
• The data is collected from the existing system.
• The information is collected from the plant.
• Additional details are collected through the Internet.

Secondary Data:

The secondary data was collected from the following categories:

• Interview
• Questionnaires
• Seminars
• Manual Records

Interview:

Interviews are used to collect information from the people concerned. Data is collected by
conducting interviews with the higher officials, and also across various departments. Interviews
allow us to discover areas of misunderstanding, realistic expectations and even indications of
resistance to the proposed system.

Questionnaires:
Through questionnaires, data is collected from the lower level to the higher level. Using
the collected information we analyze the user benefits. This method is most useful
when analysts need to actually describe how documents are handled, how processes are carried
out and whether specified steps are actually followed.
Seminars:
The data is collected by conducting seminars with people from various departments. Reliable
information is gathered from the various seminar topics.
Manual Records:
The data is collected from the existing records maintained in the various departments.

3.5 DATA ANALYSIS AND INTERPRETATION:

Data analysis bridges the gap between system level requirements engineering and software
design. Data analysis allows us to refine the software allocation and to build models of the data,
functional and behavioral domains that will be treated by the software. The requirement analysis
may be divided into problem recognition, evaluation and synthesis, modeling, specification and
review.
Data flow analysis examines the use of data to carry out specific business processes within the
scope of the system investigation. The components of the data flow strategy span both
requirements determination and system design. The tools used in data flow analysis are the Data
Flow Diagram, Data Dictionary, Data Structure Diagram and Structure Chart.
The Data Flow Diagram is the central tool and the basis from which the other components are
developed. A Data Dictionary is a catalog of all the elements composing the data flowing
through a system.

CHAPTER-4

SYSTEM SPECIFICATION

4.1: ABOUT SOFTWARE

The Visual Studio 2008 Web Application Project model uses the same project, build
and compilation semantics as the Visual Studio .NET 2003 web project model:

• All files contained within the project are defined within a project file (as well as the
  assembly references and other project meta-data settings). Files under the web's
  file-system root that are not defined in the project file are not considered part of the
  web project.
• All code files within the project are compiled into a single assembly that is built and
  persisted in the \bin directory on each compile.
• The compilation system uses a standard MSBuild based compilation process. This can be
  extended and customized using standard MSBuild extensibility rules. You can control the
  build through the property pages, for example, name the output assembly or add pre- and
  post-build actions.
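The single-assembly build and the pre-/post-build hooks described above live in the MSBuild project file itself. A simplified, hypothetical fragment might look like this; the assembly name and the post-build command are assumptions, not taken from the actual project:

```xml
<!-- Simplified fragment of a Web Application Project file (.csproj).
     AssemblyName and the post-build command are illustrative only. -->
<PropertyGroup>
  <AssemblyName>ProductionManagementSystem</AssemblyName>
  <OutputPath>bin\</OutputPath>
</PropertyGroup>
<PropertyGroup>
  <PostBuildEvent>xcopy /y "$(TargetPath)" "\\server\deploy\"</PostBuildEvent>
</PropertyGroup>
```

Editing these properties through the project's property pages in Visual Studio writes the same elements into the file.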

Because the Web Application Project model uses the same conceptual semantics as the Visual
Studio .NET 2003 web project model, it can make migrating projects much easier, minimizing
code changes. To fully enable Web Application Projects in Visual Studio 2005, you need to
first install the Microsoft Visual Studio 2005 Update to Support Web Application Projects, and
then the Web Application Projects add-in. Visual Studio 2005 Web Application Projects are not
supported in Visual Web Developer Express Edition.

Microsoft .NET Framework

It is a software component that is a part of Microsoft Windows operating systems.


It has a large library of pre-coded solutions to common programming problems and manages the
execution of programs written specifically for the framework. The .NET Framework is a key
Microsoft offering and is intended to be used by most new applications created for the Windows
platform. The pre-coded solutions that form the framework's Base Class Library cover a large
range of programming needs in areas including user interface, data access, database connectivity,
cryptography, web application development, numeric algorithms, and network communications.
The class library is used by programmers who combine it with their own code to produce
applications.

Programs written for the .NET Framework execute in a software environment that
manages the program's runtime requirements. This runtime environment, which is also a part of
the .NET Framework, is known as the Common Language Runtime (CLR). The CLR provides
the appearance of an application virtual machine so that programmers need not consider the
capabilities of the specific CPU that will execute the program. The CLR also provides other
important services such as security, memory management, and exception handling. The class
library and the CLR together compose the .NET Framework.

The .NET Framework is included with Windows Server 2003, Windows Server 2008
and Windows Vista, and can be installed on some older versions of Windows.

Microsoft SQL Server


Microsoft SQL Server 2005 is a full-featured relational database management
system (RDBMS) that offers a variety of administrative tools to ease the burdens of database
development, maintenance and administration. Here we cover six of the more
frequently used tools: Enterprise Manager, Query Analyzer, SQL Profiler, Service Manager,
Data Transformation Services and Books Online.

Enterprise Manager:
It is the main administrative console for SQL Server installations. It provides
you with a graphical "bird's-eye" view of all of the SQL Server installations on your network.
You can perform high-level administrative functions that affect one or more servers, schedule
common maintenance tasks, or create and modify the structure of individual databases.

Query Analyzer:
Offers a quick and dirty method for performing queries against any of your SQL
Server databases. It's a great way to quickly pull information out of a database in response to a
user request, test queries before implementing them in other applications, create/modify stored
procedures and execute administrative tasks.
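For example, an ad hoc query of the kind typically run in Query Analyzer might look as follows. The ProductionEntry table and its columns are assumptions for illustration, not the project's actual schema:

```sql
-- Illustrative T-SQL, as it might be run in Query Analyzer.
-- Table and column names are hypothetical.
SELECT ProductionDate, FurnaceNo, SUM(QuantityMT) AS TotalMT
FROM   ProductionEntry
WHERE  ProductionDate >= '2012-04-01'
GROUP  BY ProductionDate, FurnaceNo
ORDER  BY ProductionDate;
```

Once a query like this has been tested here, it can be moved into a stored procedure or into the application's data layer.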

SQL Profiler:
Provides a window into the inner workings of your database. You can monitor
many different event types and observe database performance in real time. SQL Profiler allows
you to capture and replay system "traces" that log various activities. It's a great tool for
optimizing databases with performance issues or troubleshooting particular problems.

Service Manager:

It is used to control the MSSQL Server (the main SQL Server process), MSDTC
(Microsoft Distributed Transaction Coordinator) and SQL Server Agent processes. An icon for
this service normally resides in the system tray of machines running SQL Server. You can use
Service Manager to start, stop or pause any one of these services.

Data Transformation Services (DTS):

Provides an extremely flexible method for importing and exporting data between a
Microsoft SQL Server installation and a large variety of other formats. The most commonly used

DTS application is the "Import and Export Data" wizard found in the SQL Server program
group.
4.2: SOFTWARE DETAILS

4.2.1: Identification of the need:

In the modern world, the computer plays a vital role in storing, maintaining, retrieving and
transforming information. The importance of computerization, both in the area of administrative
applications and in automation, arises due to the following reasons:

• Keeping track of a high volume of data.
• Acquiring, consolidating, storing, retrieving and using the data in a multi-location work
  place.
• Introducing modern technology.
• Ensuring an optimum level of performance.
• Ensuring a good quality of services.
• Reducing clerical work and avoiding duplication of data.
• Making accurate information readily available to the various levels of authority for
  analysis and decision-making.

4.2.2: Preliminary Investigation:

This first phase consists of a brief survey of the areas involved and will result in taking the
project into the next phase, postponing development for a period, or recommending that no
further action be taken. The purpose of the preliminary investigation is to evaluate the project
result, i.e. what benefits the organization will get after the completion of the project. This helps
the management to evaluate the merits of the project and the request given by the user, and to
make an informed judgment about the feasibility of the proposed system.

When the request is made, the first system activity, the preliminary investigation, begins.
The project has gone through the following three steps:

• Request clarification
• Feasibility study
• Request approval

Request Clarification

The first step of the system analysis process involves the identification of need. The analyst
(system engineer) meets with the customer and the end user. Identification of need is the starting
point in the evolution of a computer based system. The analyst assists the customer in defining
the goals of the system:

• What information will be produced?
• What information is to be provided?
• What functions and performance are required?

The analyst makes sure to distinguish between the customer's “needs” and wants. Information
gathered during this step is specified in a system concept document. The customer sometimes
prepares the original concept document before meeting the analyst. Invariably,
customer-analyst communication results in modifications to the document.

Feasibility Study:

After the problem is clearly understood and solutions are proposed, the next step is to
conduct the feasibility study, which is part of system analysis as well as the design process. The
main objective of the study is to determine whether the proposed system is feasible or not.

It may be technical feasibility, economic feasibility and operational feasibility.

Request Approval:

Because of the above specified reason, study and appropriate suggestion, the project is approved
for the development process.

System Design

System design is the process of planning a new business system to replace the old. But
before this planning can be done, we must thoroughly understand the old system
and determine how the computer can best be used to make its operation most effective.

Old System → Planning → New System

Figure 1: System design

From a project management point of view, software design is conducted in two steps:

• Preliminary Design
• Detail Design

Preliminary design is concerned with the transformation of requirements into data and
software architecture. Detail design focuses on refinements to the architectural representation
that lead to detailed data structures and algorithm representations for the software.

In this project we have prepared the following designs:

• Data design
• Architectural design
• Procedural design
• Form design or Graphical design
• Program design
• Module design
• Documentation design

Features of the Common Language Runtime:

The common language runtime manages memory, thread execution, code execution, code safety
verification, compilation, and other system services. These features are intrinsic to the managed
code that runs on the common language runtime. With regards to security, managed components
are awarded varying degrees of trust, depending on a number of factors that include their origin
(such as the Internet, enterprise network, or local computer). This means that a managed
component might or might not be able to perform file-access operations, registry-access
operations, or other sensitive functions, even if it is being used in the same active application.
The runtime enforces code access security. For example, users can trust that an executable
embedded in a Web page can play an animation on screen or sing a song, but cannot access their
personal data, file system, or network. The security features of the runtime thus enable legitimate
Internet-deployed software to be exceptionally feature rich. The runtime also enforces code
robustness by implementing a strict type- and code-verification infrastructure called the common
type system (CTS). The CTS ensures that all managed code is self-describing. The various
Microsoft and third-party language compilers generate managed code that conforms to the CTS.
This means that managed code can consume other managed types and instances, while strictly
enforcing type fidelity and type safety.

In addition, the managed environment of the runtime eliminates many common software issues.
For example, the runtime automatically handles object layout and manages references to objects,
releasing them when they are no longer being used. This automatic memory management
resolves the two most common application errors, memory leaks and invalid memory references.
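A small, self-contained C# sketch of this automatic memory management follows: objects are simply allowed to become unreachable, and the garbage collector reclaims them with no manual release call.

```csharp
// Sketch: the CLR's garbage collector reclaims objects automatically,
// so no explicit free/delete is needed. This prevents both memory leaks
// and dangling references.
using System;

class GcDemo
{
    public static string Run()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Each buffer becomes unreachable at the end of the iteration;
            // the runtime reclaims it without any manual release call.
            byte[] buffer = new byte[64 * 1024];
            buffer[0] = 1;
        }
        return "done";
    }

    static void Main()
    {
        Console.WriteLine(GcDemo.Run());
    }
}
```

In an unmanaged language the same loop would require explicit deallocation on every iteration, or it would leak roughly 64 MB.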

The runtime also accelerates developer productivity. For example, programmers can write
applications in their development language of choice, yet take full advantage of the runtime, the
class library, and components written in other languages by other developers. Any compiler
vendor who chooses to target the runtime can do so. Language compilers that target the .NET
Framework make the features of the .NET Framework available to existing code written in that
language, greatly easing the migration process for existing applications.

While the runtime is designed for the software of the future, it also supports software of today
and yesterday. Interoperability between managed and unmanaged code enables developers to
continue to use necessary COM components and DLLs.

The runtime is designed to enhance performance. Although the common language runtime
provides many standard runtime services, managed code is never interpreted. A feature called
just-in-time (JIT) compiling enables all managed code to run in the native machine language of
the system on which it is executing. Meanwhile, the memory manager removes the possibilities
of fragmented memory and increases memory locality-of-reference to further increase
performance.

Finally, the runtime can be hosted by high-performance, server-side applications, such as


Microsoft® SQL Server™ and Internet Information Services (IIS). This infrastructure enables
you to use managed code to write your business logic, while still enjoying the superior
performance of the industry's best enterprise servers that support runtime hosting.

C#.NET:

Windows Forms is the new platform for Microsoft Windows application development, based on
the .NET Framework. This framework provides a clear, object-oriented, extensible set of classes
that enable you to develop rich Windows applications. Additionally, Windows Forms can act as
the local user interface in a multi-tier distributed solution. Windows Forms is a framework for
building Windows client applications that utilize the common language runtime. Windows
Forms applications can be written in any language that the common language runtime supports.

What Is a Form?

A form is a bit of screen real estate, usually rectangular, that you can use to present information
to the user and to accept input from the user. Forms can be standard windows, multiple document
interface (MDI) windows, dialog boxes, or display surfaces for graphical routines. The easiest
way to define the user interface for a form is to place controls on its surface. Forms are objects
that expose properties which define their appearance, methods which define their behavior, and
events which define their interaction with the user. By setting the properties of the form and
writing code to respond to its events, you customize the object to meet the requirements of your
application.

As with all objects in the .NET Framework, forms are instances of classes. The form you create
with the Windows Forms Designer is a class, and when you display an instance of the form at
run time, this class is the template used to create the form. The framework also allows you to
inherit from existing forms to add functionality or modify existing behavior. When you add a
form to your project, you can choose whether it inherits from the Form class provided by the
framework, or from a form you have previously created.
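For instance, a derived form is just a class that inherits from System.Windows.Forms.Form. The sketch below is illustrative only; the form name and property values are hypothetical, not taken from the project:

```csharp
using System;
using System.Windows.Forms;

// A form is a class; displaying a form means instantiating that class.
public class LoginForm : Form
{
    public LoginForm()
    {
        // Properties define the form's appearance...
        this.Text = "Production Management System - Login";
        this.Width = 320;
        this.Height = 200;

        // ...and events define its interaction with the user.
        this.Load += (sender, e) => { /* initialize controls here */ };
    }
}

public static class Program
{
    [STAThread]
    public static void Main()
    {
        // Each displayed form is an instance created from the class template.
        Application.Run(new LoginForm());
    }
}
```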

Additionally, forms are controls, because they inherit from the Control class.

Within a Windows Forms project, the form is the primary vehicle for user interaction. By
combining different sets of controls and writing code, you can elicit information from the user
and respond to it, work with existing stores of data, and query and write back to the file system
and registry on the user's local computer.

Although the form can be created entirely in the Code Editor, it is easier to use the Windows
Forms Designer to create and modify forms.

Some of the advantages of using Windows Forms include the following:

 Simplicity and power: Windows Forms is a programming model for developing Windows applications that combines the simplicity of the Visual Basic 6.0 programming model with the power and flexibility of the common language runtime.
 Lower total cost of ownership: Windows Forms takes advantage of the versioning
and deployment features of the common language runtime to offer reduced
deployment costs and higher application robustness over time. This significantly
lowers the maintenance costs (TCO) for applications written in Windows Forms.
 Architecture for controls: Windows Forms offers an architecture for controls and
control containers that is based on concrete implementation of the control and
container classes. This significantly reduces control-container interoperability issues.
 Security: Windows Forms takes full advantage of the security features of the common language runtime. This means that Windows Forms can be used to implement everything from an untrusted control running in the browser to a fully trusted application installed on a user's hard disk.
 XML Web services support: Windows Forms offers full support for quickly and
easily connecting to XML Web services.
 Rich graphics: Windows Forms is one of the first ship vehicles for GDI+, a new
version of the Windows Graphical Device Interface (GDI) that supports alpha
blending, texture brushes, advanced transforms, rich text support, and more.
 Flexible controls: Windows Forms offers a rich set of controls that encompass all of
the controls offered by Windows. These controls also offer new features, such as "flat
look" styles for buttons, radio buttons, and check boxes.
 Data awareness: Windows Forms offers full support for the ADO data model.
 ActiveX control support: Windows Forms offers full support for ActiveX controls.
You can easily host ActiveX controls in a Windows Forms application. You can also
host a Windows Forms control as an ActiveX control.
 Licensing: Windows Forms takes advantage of the common language runtime
enhanced licensing model.
 Printing: Windows Forms offers a printing framework that enables applications to
provide comprehensive reports.

 Accessibility: Windows Forms controls implement the interfaces defined by Microsoft Active Accessibility (MSAA), which makes it simple to build applications that support accessibility aids, such as screen readers.
 Design-time support: Windows Forms takes full advantage of the meta-data and
component model features offered by the common language runtime to provide
thorough design-time support for both control users and control implementers.

ADO.NET :

ADO.NET is an evolution of the ADO data access model that directly addresses user
requirements for developing scalable applications. It was designed specifically for the web with
scalability, statelessness, and XML in mind.

ADO.NET uses some ADO objects, such as the Connection and Command objects, and also
introduces new objects. Key new ADO.NET objects include the DataSet, DataReader, and
DataAdapter.

A DataAdapter is the object that connects to the database to fill the DataSet. Then, it connects
back to the database to update the data there, based on operations performed while the DataSet
held the data. In the past, data processing has been primarily connection-based. Now, in an effort
to make multi-tiered apps more efficient, data processing is turning to a message-based approach
that revolves around chunks of information. At the center of this approach is the DataAdapter,
which provides a bridge to retrieve and save data between a DataSet and its source data store. It
accomplishes this by means of requests to the appropriate SQL commands made against the data
store.

The XML-based DataSet object provides a consistent programming model that works with all
models of data storage: flat, relational, and hierarchical. It does this by having no 'knowledge' of
the source of its data, and by representing the data that it holds as collections and data types. No
matter what the source of the data within the DataSet is, it is manipulated through the same set of
standard APIs exposed through the DataSet and its subordinate objects.

While the DataSet has no knowledge of the source of its data, the managed provider has detailed and specific information. The role of the managed provider is to connect, fill, and persist the DataSet to and from data stores. The OLE DB and SQL Server .NET Data Providers (System.Data.OleDb and System.Data.SqlClient) that are part of the .NET Framework provide
four basic objects: the Command, Connection, DataReader and DataAdapter. In the remaining
sections of this document, we'll walk through each part of the DataSet and the OLE DB/SQL
Server .NET Data Providers explaining what they are, and how to program against them.

The following sections will introduce you to some objects that have evolved, and some that are
new. These objects are:

 Connections. For connecting to and managing transactions against a database.
 Commands. For issuing SQL commands against a database.
 DataReaders. For reading a forward-only stream of data records from a SQL Server data source.
 DataSets. For storing, remoting and programming against flat data, XML data and relational data.
 DataAdapters. For pushing data into a DataSet, and reconciling data against a database.

When dealing with connections to a database, there are two different options: the SQL Server .NET Data Provider (System.Data.SqlClient) and the OLE DB .NET Data Provider (System.Data.OleDb). In these samples we will use the SQL Server .NET Data Provider, which is written to talk directly to Microsoft SQL Server. The OLE DB .NET Data Provider is used to talk to any OLE DB provider (as it uses OLE DB underneath).

Connections

Connections are used to 'talk to' databases, and are represented by provider-specific classes such as SqlConnection. Commands travel over connections, and resultsets are returned in the form of streams which can be read by a DataReader object, or pushed into a DataSet object.
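A minimal sketch of opening a connection; the connection string is an assumption, so substitute your own server, database and credentials:

```csharp
using System;
using System.Data.SqlClient;

class ConnectionDemo
{
    static void Main()
    {
        // Hypothetical connection string; adjust server, database and credentials.
        string connString = "Server=localhost;Database=Northwind;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connString))
        {
            conn.Open();                    // throws SqlException on failure
            Console.WriteLine(conn.State);  // connection is now open
        }                                   // Dispose() closes the connection, even on exceptions
    }
}
```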

Commands

Commands contain the information that is submitted to a database, and are represented by provider-specific classes such as SqlCommand. A command can be a stored procedure call, an UPDATE statement, or a statement that returns results. You can also use input and output parameters, and return values as part of your command syntax. The example below shows how to issue an INSERT statement against the Northwind database.
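A sketch of such an INSERT, assuming the standard Northwind Shippers table on a local SQL Server (connection string and values are placeholders):

```csharp
using System;
using System.Data.SqlClient;

class InsertDemo
{
    static void Main()
    {
        string connString = "Server=localhost;Database=Northwind;Integrated Security=true";
        string sql = "INSERT INTO Shippers (CompanyName, Phone) VALUES (@name, @phone)";

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            // Input parameters keep values out of the SQL text itself.
            cmd.Parameters.AddWithValue("@name", "Speedy Express 2");
            cmd.Parameters.AddWithValue("@phone", "(503) 555-0100");

            conn.Open();
            int rows = cmd.ExecuteNonQuery();  // number of rows affected by the INSERT
            Console.WriteLine(rows);
        }
    }
}
```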

DataReaders

The DataReader object is somewhat synonymous with a read-only/forward-only cursor over


data. The DataReader API supports flat as well as hierarchical data. A DataReader object is
returned after executing a command against a database. The format of the returned DataReader
object is different from a recordset. For example, you might use the DataReader to show the
results of a search list in a web page.
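For example, reading rows in a forward-only fashion (same assumed Northwind database as above):

```csharp
using System;
using System.Data.SqlClient;

class ReaderDemo
{
    static void Main()
    {
        string connString = "Server=localhost;Database=Northwind;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("SELECT CompanyName FROM Shippers", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())                 // forward-only: each call advances one row
                    Console.WriteLine(reader.GetString(0));
            }                                         // close the reader before reusing the connection
        }
    }
}
```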

DataSets and DataAdapters

DataSets
The DataSet object is similar to the ADO Recordset object, but more powerful, and with one
other important distinction: the DataSet is always disconnected. The DataSet object represents a
cache of data, with database-like structures such as tables, columns, relationships, and
constraints. However, though a DataSet can and does behave much like a database, it is
important to remember that DataSet objects do not interact directly with databases, or other
source data. This allows the developer to work with a programming model that is always
consistent, regardless of where the source data resides. Data coming from a database, an XML
file, from code, or user input can all be placed into DataSet objects. Then, as changes are made to
the DataSet they can be tracked and verified before updating the source data. The GetChanges method of the DataSet object actually creates a second DataSet that contains only the changes to the data. This DataSet is then used by a DataAdapter (or other objects) to update the original data source.

The DataSet has many XML characteristics, including the ability to produce and consume XML
data and XML schemas. XML schemas can be used to describe schemas interchanged via
WebServices. In fact, a DataSet with a schema can actually be compiled for type safety and
statement completion.
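The disconnected model can be seen without any database at all. The table and column names below are hypothetical, chosen only for illustration:

```csharp
using System;
using System.Data;

class DataSetDemo
{
    static void Main()
    {
        // A DataSet holds tables, columns and constraints entirely in memory.
        DataSet ds = new DataSet("Production");
        DataTable shifts = ds.Tables.Add("Shifts");
        shifts.Columns.Add("ShiftNo", typeof(int));
        shifts.Columns.Add("KwhReading", typeof(double));

        shifts.Rows.Add(1, 1250.5);
        ds.AcceptChanges();              // baseline: no pending changes

        shifts.Rows.Add(2, 1310.0);      // a tracked change

        // GetChanges returns a second DataSet containing only the changed rows.
        DataSet delta = ds.GetChanges();
        Console.WriteLine(delta.Tables["Shifts"].Rows.Count);   // 1
    }
}
```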

DataAdapters (OLEDB/SQL)

The DataAdapter object works as a bridge between the DataSet and the source data. Using the provider-specific SqlDataAdapter (along with its associated SqlCommand and SqlConnection objects) can increase overall performance when working with a Microsoft SQL Server database. For other OLE DB-supported databases, you would use the OleDbDataAdapter object and its associated OleDbCommand and OleDbConnection objects.

The DataAdapter object uses commands to update the data source after changes have been made
to the DataSet. Using the Fill method of the DataAdapter calls the SELECT command; using the
Update method calls the INSERT, UPDATE or DELETE command for each changed row. You
can explicitly set these commands in order to control the statements used at runtime to resolve
changes, including the use of stored procedures. For ad-hoc scenarios, a CommandBuilder object
can generate these at run-time based upon a select statement. However, this run-time generation
requires an extra round-trip to the server in order to gather required metadata, so explicitly
providing the INSERT, UPDATE, and DELETE commands at design time will result in better
run-time performance.
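A sketch tying Fill and Update together, again assuming the Northwind Shippers table; the CommandBuilder generates the UPDATE at run time, at the cost of the extra metadata round-trip noted above:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class AdapterDemo
{
    static void Main()
    {
        string connString = "Server=localhost;Database=Northwind;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT ShipperID, CompanyName, Phone FROM Shippers", conn))
        {
            // Derives INSERT/UPDATE/DELETE commands from the SELECT at run time.
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

            DataSet ds = new DataSet();
            adapter.Fill(ds, "Shippers");                    // executes the SELECT

            ds.Tables["Shippers"].Rows[0]["Phone"] = "(503) 555-0199";

            adapter.Update(ds, "Shippers");                  // issues an UPDATE for the changed row
        }
    }
}
```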

1. ADO.NET is the next evolution of ADO for the .NET Framework.


2. ADO.NET was created with n-Tier, statelessness and XML in the forefront. Two new
objects, the DataSet and DataAdapter, are provided for these scenarios.
3. ADO.NET can be used to get data from a stream, or to store data in a cache for
updates.
4. There is a lot more information about ADO.NET in the documentation.

5. Remember, you can execute a command directly against the database in order to do
inserts, updates, and deletes. You don't need to first put data into a DataSet in order to
insert, update, or delete it.

Also, you can use a DataSet to bind to the data, move through the data, and navigate the relationships between tables.

Books Online:

Books Online is an often overlooked resource provided with SQL Server that contains answers to a variety of administrative, development and installation issues. It's a great resource to consult before turning to the Internet or technical support.

4.3: HARDWARE REQUIREMENTS:

The hardware specification of the system in which the project has been developed are
presented below
Microprocessor : Intel Pentium IV
System Bus Speed : 533 MHz
Processor Speed : 2.0 GHz
Main Memory : 256 MB
Secondary Memory : 40 GB
Secondary Memory Speed : 266 MHz

4.4: SOFTWARE REQUIREMENTS

Operating System : Windows XP / Windows 2000 Server
Web Server : IIS (Internet Information Services)
Web Site Designing Tool : ASP.NET 3.5
Language : C#
Data Base : MS SQL Server 2005

CHAPTER-5

SYSTEM PLANNING

System Planning is a method for analyzing, defining and designing the information architecture of an organization. It was later made available to customers and became an important planning tool for many organizations. It is a very complex method dealing with data, processes, strategies, aims and organizational departments, all of which are interconnected. System Planning brings a new approach to designing an information architecture.

Planning for a system has a time-horizon dimension and a focus dimension. The time-horizon dimension specifies the time range of the plan, whereas the focus dimension indicates whether the primary concern is strategic, managerial, or operational. The project we were assigned was required to be completed within 25 weeks. Our plan was as follows:

Requirements analysis, preliminary investigation and information gathering were to be covered within the 1st and 2nd weeks; 12 weeks were allotted for the design of the system under development; and 1 week for testing and implementation.

CHAPTER-6
SYSTEM DESIGN

6.1: PROJECT DESCRIPTION:

PRODUCTION:

The Production module has two modes, Admin mode and User mode, and generates two very important reports.

Admin mode contains sub-modules for:
 Maintaining the details of the production process, with the privilege to modify those details when necessary.
 Maintaining the details of the various departments.

The Department mode contains sub-modules that are generally used to show various reports accessible only to the admin. This mode contains the following sub-modules:

 Form Details:
The function of this sub-module is to give details about the opening and closing of the tap hole on the previous day.

 Shutdown Report:
The function of this sub-module is to give a detailed report of shutdowns. The user can view the total shutdown hours of a particular session along with the reason for each shutdown (electrode broken, water leakage, sparking, slipping, power failure etc.). It shows all the details month-wise with a specific reason.

 Detailed Production Report:
This report shows all the details about the production of HCFC at the different furnaces: the total production of HCFC to date, the target for today, how much HCFC was produced, and overall details of HCFC production.

 Technical Data:
This report shows all information about the items used in the furnace during the production process: the running hours of the process, how much power was consumed, how much briquette, sponge iron, coal and coke were consumed, and the percentage of fixed carbon and V.M. in the reductant.

● Report Analysis of HCFC and Slag:

This report comprises the following reports:

● Analysis Report of HCFC:
This report shows the percentage of chromium, carbon and silicon in HCFC.
● Analysis Report of Slag:
This report shows the percentages of SiO2, Cr2O3, FeO, Al2O3, MgO and CaO in slag.

● Analysis Report of Despatch:

This report shows the details of despatch (HCFC).

User mode is a view where only an authorized user can access the page, by supplying a valid password and user id on the basis of their department.

This mode contains the following sub-modules:
Form Details:
This gives details about previous data, which can be used for reference purposes only.
Shifting Details:
The function of this sub-module is to record the exact details of tap hole opening time, closing time, KWH reading at tap, and voltage at a particular time.
Raw Material Details:
The function of this sub-module is to record the use of raw materials and also to maintain the stock.
Furnace Details:
The function of this module is to maintain records of furnace bottom temperature, slipping of the electrode, water pressure of the furnace etc.
Last Slipping:
This page gives details about the last electrode slipping, if one occurred.
Production Summary:
This sub-module shows the details of production, such as total paste charge used, total power consumption etc.
Raw Materials:
This sub-module is used to keep records of all raw materials and also maintains the stock.
Shutdown and Production:

This sub-module is used to store the stoppage details and also generates a monthly report.

Production and Sales:

This sub-module records the entire production process and the contents of material in HCFC, and also generates reports.
Jigging:
This sub-module comes after the production process, converting the HCFC into various forms, i.e. small chips or powder, ready for sale and despatch.
Loading Schedule:

The main function of this module is to maintain records of raw material loading into the furnace, such as loading briquette, chromite, quartzite, reductant etc. This module has several sub-modules:

Daily Bin Position:

The main function of this sub-module is to maintain records of the percentage of coke and coal, and also the screen station position.

Feeding of Chromite:
The function of this sub-module is to fetch data on chromite, i.e. the percentage of Cr2O3 and FeO, its source and the blending ratio, and then maintain a record of all the data.
Feeding of Briquette:
The function of this sub-module is to fetch data on briquette, i.e. the percentage of Cr2O3 and FeO, its source and the blending ratio, and then maintain a record of all the data.
Feeding of Reductant:
The function of this sub-module is to retrieve data related to the reductant from the database, i.e. the percentage of ash, V.M., F.C. and P, calculate the blending ratio, and then maintain a record of all the data.
Feeding of Quartzite:
The function of this sub-module is to retrieve data related to quartzite from the database, i.e. the percentage of SiO2, calculate the blending ratio, and then record all the data.

Despatch Clearance:

The function of this module is to keep records of the raw material despatch process: the name of the party, the date of despatch, the quantity of raw material despatched, and the percentage of Cr, C, Si and P in each size etc.

Chemical Laboratory:
The main function of this module is to keep records of the percentage of other elements in the raw materials. All raw materials are tested, and according to the test results the data are recorded in their specific tables.
This module has the following sub-modules:

Coke Analysis:
Records the percentage of other elements, such as moisture, V.M., sulphur, phosphorus and ash, in the coke.
Briquette Analysis:
Records the percentage of other elements, such as SiO2, Al2O3, Fe, Cr2O3, CaO, MgO and phosphorus, in the briquette.

Quartzite Analysis:
Records the percentage of other elements, such as SiO2, Al2O3, Fe, CaO and MgO, in the quartzite.
Molasses Analysis:
Records properties such as density, C3S and SO3 in the molasses.

Consolidated Analysis:
Stock of FPH: maintains the stock of finished product after production, and also generates reports.

6.2: DATABASE DESIGN:

The primary need at the outset of design is the database. An important requirement in the design is the representation of data in different tables. The data items for the tables are classified based on their characteristics, and the relationships between the data items in each table are identified.
Data that are unlikely to change, and that facilitate the operation of other processes, are kept in the master tables. Similarly, data that form part of a transaction are put into the transaction tables.

Thus, looking at the different data available, the tables are classified as master tables, transaction tables etc., and the database design is carefully done with the aim of achieving its main objectives:
 Data Integration.
 Data Independence.

Normalization:
Normalization is the process of analyzing the given relation schemas based on their
Functional Dependencies and primary keys to achieve the desirable properties of
 Minimizing redundancy
 Minimizing the insertion, deletion and update anomalies
Normalization is carried out for the following reasons:
 To structure the data so that the relationships between entities can be represented correctly.
 To permit simple retrieval of data in response to query and report requests.
 To reduce the need to restructure or reorganize data when new application requirements arise.

Normalization consists of various levels:

First Normal Form (1NF):

A table (relation) is in 1NF if:
 There are no duplicate rows in the table.
 Each cell is single-valued (i.e., there are no repeating groups or arrays).
 Entries in a column (attribute, field) are of the same kind.

Second Normal Form (2NF):

 Second Normal Form is based on the concept of full functional dependency.
 A table (relation) is in 2NF if it is in First Normal Form and all non-key attributes are dependent on the key.
 2NF is sometimes phrased as: "A table is in 2NF if it is in 1NF and it has no partial dependencies."

Third Normal Form (3NF):

 Third Normal Form is based on the concept of transitive dependency.
 A table (relation) is in 3NF if it is in Second Normal Form and it has no transitive dependencies.

6.3: Database Tables:

This figure represents all the tables in the project.

Registration Tables:

It shows the registration table of admin

Consumption Entry Table:

It shows the consumption table

Material information:

It shows the material information:


Admin Inbox Table:

It shows the administrator mail box


Employee Table:-

It shows the employee name, id and dept_id.

Department Table:

It shows the department table


Department inbox:

Employee table

Designation table:

6.4: DATAFLOW DIAGRAM

A data flow diagram (DFD) is a graphical representation that depicts information flow and the transforms that are applied as data move from input to output. The Data Flow Diagram is also known as a Data Flow Graph or a Bubble Chart.
The Data Flow Diagram may be used to represent a system or software at any level of abstraction. Data Flow Diagrams may be partitioned into levels that represent increasing information flow and functional detail. The Data Flow Diagram therefore provides a mechanism for functional modeling as well as information modeling.

A level 0 DFD, also called a fundamental system model or a context model, represents the entire software element as a single bubble with input and output data indicated by incoming and outgoing arrows respectively. Additional processes and information flow paths are represented as the level 0 DFD is partitioned to reveal more detail. For example, a level 1 DFD might contain five or six bubbles with interconnecting arrows. Each of the processes represented at level 1 is a sub-function of the overall system depicted in the context model.

A fundamental model for system flow indicates that the primary input is A and the ultimate output is B. The basic notation used to develop a DFD is not in itself sufficient to describe requirements for software.

Logical Data Flow Diagrams:

Logical Data Flow Diagrams represent the transformation of the data from input to output through processing, logically and independently of the physical components that may be associated with the system.
Physical Data Flow Diagrams:
Physical Data Flow Diagrams show the actual implementation and movement of data between people, departments, and workstations.

Each component of a DFD is labeled with a descriptive name. Process names are further numbered for identification purposes. The number assigned to a specific process does not correspond to the sequence of processes; it is strictly for identification.

A data flow diagram allows parallel activities, i.e. a number of data flows coming out of a source and going into a destination. A DFD concentrates on the data moving through the system, not on the devices or equipment. A DFD may consist of a number of levels. The top-level diagram is called the Context Diagram; it consists of a single process and plays a very important role in studying the system. It gives the most general and broadest view of the system. Moreover, it gives a pictorial representation of the scope boundaries of the system under study.

The four basic symbols used to construct data flow diagrams are:
 A double square represents a data source or destination.
 A directed line represents the flow of data.
 A circle represents a process that transforms the data.
 An open-ended rectangle represents data storage.

DFD SYMBOLS:

Symbol            Description
External Entity   A producer or consumer of information that resides outside the bounds of the system to be modeled.
Process           A transformer of information that resides within the bounds of the system to be modeled.
Data item         A data item or collection of data items; the arrowhead indicates the direction of data flow.
Data store        A repository of data that is to be stored for use by one or more processes.

6.5: DFD DIAGRAMS:

Level-0 DFD Diagram

Level-1 DFD Diagram

Level-2 DFD Diagram
6.6: PROCESS FLOW DIAGRAMS

ER Diagram:

Admin use case diagram:

Production Use case diagram:

Purchase Use case Diagram:

Sale use case Diagram:

Stock use case diagram:

Lab Use case Diagram:

Class Diagram:

Production sequence diagram:

Sale sequence diagram:

Purchase sequence diagram:

Stock sequence diagram:

Lab sequence diagram:

CHAPTER-7:

TESTING AND IMPLEMENTATION

7.1: SYSTEM TESTING

Testing is the process of executing a program with the explicit intention of finding errors, that is, making the program fail. A successful test, then, is one that finds an as yet undiscovered error. As an additional benefit, testing demonstrates that the software functions appear to be working to the specifications.

PURPOSE OF TESTING:

Testing has several purposes:

 To affirm the quality of the project.
 To find and eliminate any errors remaining from previous stages.
 To validate the software and to establish the operational reliability of the system.

TESTING STRATEGIES:
 Unit Testing.
 Integration Testing.

UNIT TESTING:

Instead of testing the system as a whole, unit testing focuses on the modules that make up the system. Each module is taken up individually and tested for correctness in coding and logic. Errors resulting from the interaction of modules are initially avoided. The advantages of unit testing are:
 The size of a module is quite small, so errors can be easily located, and the confusing interaction of multiple errors in widely different parts of the software is eliminated.
 Module-level testing can be exhaustive.
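As an illustration (the module and its formula are hypothetical, not taken from the project), a unit test exercises one module in isolation, with no database or user interface involved:

```csharp
using System;

static class StockModule
{
    // Hypothetical module under test: closing stock = opening + receipts - consumption.
    public static int ClosingStock(int opening, int receipts, int consumption)
        => opening + receipts - consumption;
}

static class StockModuleTests
{
    static void Check(bool condition, string name)
    {
        if (!condition) throw new Exception("FAILED: " + name);
    }

    static void Main()
    {
        // The module is small, so any failure is easy to localize.
        Check(StockModule.ClosingStock(100, 40, 30) == 110, "normal case");
        Check(StockModule.ClosingStock(0, 0, 0) == 0, "empty stock");
        Console.WriteLine("All unit tests passed.");
    }
}
```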

INTEGRATION TESTING:

Integration testing looks for errors resulting from the integration of modules. One specific target of integration testing is the interface: whether parameters match on both sides as to type, permissible range and meaning. The analyst tries to find areas where modules have been designed with different specifications for data length, data type and data element names. It is a black-box testing method.

7.2: SYSTEM IMPLEMENTATION:

Implementation is the process of having system personnel check out and put new equipment into use, train users, install the new application, and construct any files or databases needed to use it. Depending on the size of the organization that will be involved in using the application, and the risk associated with its use, the developers may choose to pilot the operation in only one area of the firm. Regardless of the implementation strategy, developers strive to ensure that the system's initial use is trouble-free.
Once installed, applications are often used for years. However, both the organization and the users will change, and the environment will be different over weeks and months. Therefore the production system has to be maintained: modifications and changes will be made to the software, database and procedures to meet emerging user requirements.
System implementation consists of system coding and system testing.

Coding:

Coding for the software has been done in ASP.NET (C#) and SQL Server 2005.

CHAPTER 8

SYSTEM MAINTENANCE & EVALUATION

Software Development has many phases. These phases include Requirements Engineering,
Architecting, Design, Implementation, Testing, Software Deployment and maintenance.
Maintenance is the last stage of the software life cycle. After the product has been released, the
maintenance phase keeps the software up to date with environment changes and changing user
requirements. Maintenance can only happen efficiently if the earlier phases are done properly.
There are four major problems that can slow down the maintenance process: unstructured code,
maintenance programmers having insufficient knowledge of the system, documentation being
absent, out of date, or at best insufficient, and software maintenance having a bad image. The
success of the maintenance phase relies on these problems being fixed earlier in the life cycle.

Maintenance consists of four parts. Corrective maintenance deals with fixing bugs in the
code. Adaptive maintenance deals with adapting the software to new environments. Perfective
maintenance deals with updating the software according to changes in user requirements. Finally,
preventive maintenance deals with updating documentation and making the software more
maintainable. All changes to the system can be characterized by these four types of maintenance.
Corrective maintenance is ‘traditional maintenance’ while the other types are considered as
‘software evolution.’

CHAPTER 9

COST AND BENEFIT ANALYSIS

Cost–benefit analysis is a systematic process for calculating and comparing benefits and costs of
a project, decision or government policy. Cost and benefit analysis has two purposes:

1. To determine if it is a sound investment/decision,

2. To provide a basis for comparing projects.

It involves comparing the total expected cost of each option against the total expected benefits, to
see whether the benefits outweigh the costs, and by how much.

Benefits and costs are expressed in money terms, and are adjusted for the time value of money,
so that all flows of benefits and flows of project costs over time are expressed on a common
basis in terms of their "net present value."

Cost and benefit analysis attempts to measure the positive or negative consequences of a project. The cost estimates provided here exclude the costs of hardware and software.

Cost and benefit analysis is a technique for evaluating a project or investment by comparing the economic benefits with the economic costs of the activity. It has several objectives. First, it can be used to evaluate the economic merit of a project. Second, the results from a series of benefit-cost analyses can be used to compare competing projects. It can be used to assess business decisions, to examine the worth of public investments, or to assess the wisdom of using natural resources or altering environmental conditions. Ultimately, it aims to examine potential actions with the objective of increasing social welfare.

Regardless of the aim, all benefit-cost analyses have several properties in common. A
cost and benefit analysis begins with a problem to be solved. Without a doubt, results from a cost
and benefit analysis can be used to raise the level of public debate surrounding a project.

11.3: UML DIAGRAM

UML (Unified Modeling Language) is a modeling language used for modeling a system; a UML diagram is a graphical representation of the system.

An Overview of UML

The UML is a language for


 Visualizing
 Specifying
 Constructing
 Documenting
These are the artifacts of a software-intensive system.
A conceptual model of UML:
The three major elements of UML are
 The UML’s basic building blocks
 The rules that dictate how those building blocks may be put together.
 Some common mechanisms that apply throughout the UML.

Basic building blocks of the UML

The vocabulary of UML encompasses three kinds of building blocks:


 Things
 Relationships
 Diagrams
Things are the abstractions that are first-class citizens in a model;
Relationships tie these things together;
Diagrams group the interesting collection of things.

Relationships in the UML:
There are four kinds of relationships in the UML:
1. Dependency
2. Association
3. Generalization
4. Realization

11.3.1: USE CASE DIAGRAM

A use case is a description of a set of sequences of actions that a system performs, yielding an
observable result of value to a particular actor. A use case diagram is one kind of UML diagram;
it represents all the use cases of the system, the actors, and the relationships between them.
Use case diagrams are one of the five diagrams in the UML for modeling the dynamic aspects of
systems (activity diagrams, sequence diagrams, state chart diagrams, and collaboration diagrams
are the other four). Use case diagrams are central to modeling the behavior of a system, a
sub-system, or a class. Each one shows a set of use cases, actors, and the relationships among them.

Common Properties:
A use case diagram is just a special kind of diagram and shares the common properties of all
other diagrams: a name and graphical contents that are a projection into the model. What
distinguishes a use case diagram from all other kinds of diagrams is its particular content.

Contents
Use case diagrams commonly contain:
 Use cases
 Actors
 Dependency, generalization, and association relationships

Here the system is IMTS.

The actors of the system are admin, user, contractor and security. The use cases of the system
are authentication, edit profile, provide vehicle, maintain fire fighting, appoint contractor and
security, create challan, report generation, worker creation, and keep entry and exit details.

CHAPTER-10

SCREEN SHOTS

Login page:

Registration page:

Next step for registration:

Last page for admin registration:

Admin home page:

Sale home page

Production home page:

Purchase home page:

Stock home page:

Lab home page:

User registration page:

Testing:-

System testing is the stage of implementation aimed at ensuring that the system
works at all levels and is effective before live operation starts. The system test should be a
definite confirmation that everything is correct and an opportunity to show the users that the
system works.

Software testing is a critical element of software quality assurance and represents
the ultimate review of specification, design, and coding. If testing is done successfully, it will
uncover errors in the software.

Testing demonstrates that the system functions appear to work according to the
specifications. Data collected while testing is conducted provide a good indication of software
reliability.
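As an illustration of the kind of unit test that could back up such a system test, here is a minimal sketch using Python's unittest module. The validate_login helper is hypothetical and not part of the actual IFCAL code base:

```python
import unittest


def validate_login(username, password):
    """Hypothetical login check: non-empty username, password of 6+ chars."""
    return bool(username) and len(password) >= 6


class LoginTests(unittest.TestCase):
    def test_valid_credentials(self):
        self.assertTrue(validate_login("admin", "secret1"))

    def test_rejects_short_password(self):
        self.assertFalse(validate_login("admin", "abc"))

    def test_rejects_empty_username(self):
        self.assertFalse(validate_login("", "secret1"))
```

Such tests can be run with `python -m unittest`; each failing assertion points to a deviation from the specification, which is exactly the error-uncovering role of testing described above.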

CONCLUSION

This software provides a user-friendly approach towards the system. The system has been well
developed and, when implemented, is bound to satisfy all the requirements. Painstaking efforts
have been taken to make the software impeccable and upgradeable.
 This system enables better and more efficient work; it speeds up all the
activities and provides good communication amongst all departments.
 This software will provide a user-friendly approach towards the organization.
 This system is well developed and, when implemented, is bound to satisfy all the
user requirements.
 There is hope that this software will be utilized to its maximum and will do a good job
in the long run.

BIBLIOGRAPHY

Black Book by Richard Wobson
ASP.NET Bible
The Complete Reference by MacDonald
Google search
Web references:
www.cooltext.com
www.ifcal.com

