ON
Production Management System For IFCAL
BY
ARUN KUMAR PATTANAYAK
Regd. No: 0931020075
Declaration
I hereby declare that the project entitled “Production Management System For IFCAL”,
submitted to the Department of Computer Applications, Institute of Technical Education &
Research, Siksha O Anusandhan University, Bhubaneswar, Odisha, in fulfillment of the
requirements for the award of the degree of Master in Computer Application for the session
2009-2012, is an authentic record of my own work carried out under the guidance of
Ms. Preeti Routray and Mr. Satyabrata Mishra, and that the project has not previously formed
the basis for the award of any other degree. The report has been prepared in compliance with
the guidelines specified by the University.
Place: Bhubaneswar
Date: 04/05/2012
Signature
ARUN KUMAR PATTANAYAK
0931020075
This is to certify that the above statement made by the candidate is correct to the best of my
knowledge.
Ms. Preeti Routray
Lecturer
Department of Computer Applications
Acknowledgement
I am very grateful to my project guide Ms. Preeti Routray for giving her valuable time and
constructive guidance in preparing the Synopsis/Project. It would not have been possible
to complete this project in such a short period of time without her kind encouragement and
valuable guidance.
CERTIFICATE
This is to certify that the project entitled “Production Management System For IFCAL”,
submitted to the Department of Computer Applications, Institute of Technical Education and
Research, has been carried out at IDCOL Ferrochrome And Alloy Limited under our guidance
and is worthy of acceptance for the award of the degree. The work fulfills the entire
requirement as per the regulations of the University and, in our opinion, has reached the
standard required for submission.
CONTENTS
CHAPTER 1
The main aim is to maintain transparency and accuracy and to avoid human error. This
system keeps complete records about board meetings and also generates reports as per the
office requirements. The system has several forms in which a user can enter the details about
the mineral transported.
The manual system can be eliminated, so pen-and-paper work is minimized. Fewer staff
members are required as compared to the manual system.
The speed, economy and efficiency are high enough to be considered key factors, which
make a strong case for computerization.
Application Development for various state departments and public sector undertakings.
The scope of the project is:
This project is basically developed for IFCAL, which is a subsidiary company of IDCOL,
so it is used by IFCAL and IDCOL. Any other company with the same working procedure
can also use this software.
The project can be used in all the departments of the company under the IDCOL organization,
such as production, maintenance, order, purchase, supply, security, admin, etc.
Different branches of this company can use it; at present the software is used in the head
branch.
1.1: SYNOPSIS
The project titled “Production Management System” has been developed in
Microsoft Visual Studio 2008 using ASP.NET 3.5 as the front end and SQL Server 2005 as
the back end.
The project Production Management System has been developed for IDCOL
Ferrochrome And Alloy Ltd. The main aim is to automate the production and dispatch
process. This system keeps complete records of the entry of raw materials, raw material
inventory, the production process, etc.
This system has several forms in which a user can enter the details about the production
process. Each form has proper validation along with proper authentication.
CHAPTER- 2
INTRODUCTION
IDCOL Software Limited (ISL), a subsidiary of IDCOL and a Government of Orissa
enterprise, is emerging as a complete IT solutions company in the state. It has been set up
with the objective of establishing a strong and commercially viable business in the area of
information technology for the betterment of the people of the State of Orissa, along with the
aim of preparing the young generation to meet future requirements in the IT field. It has been
given the status of Strategic Microsoft Technology Partner (SMTP) and system integrator of
Compaq. In association with IT majors like Microsoft and Compaq, ISL has envisioned
carrying out prestigious projects to bring e-governance to the state.
ISL today has the latest computing facilities in its computer center. It has more than 30
Pentium computers connected on an intranet. The internet connectivity in the lab facilitates
instant access to rich information on the net. Highly qualified, research-oriented technical
staff have made ISL an ideal platform today to carry out industry-standard training and
software development activities at its center. It has the following facilities:
Development and training server
Web server
2.2: PROJECT INTRODUCTION
The project “PRODUCTION MANAGEMENT SYSTEM” has been developed for the
IFCAL Production Department. This is a web-based project with ASP.NET as the front-end
tool and SQL Server 2005 as the back-end tool.
“Production Management System” has several forms that are properly validated and
that provide facilities to add new users and to enter and modify data. The entire project is
report-driven: it generates a report for each form whenever an entry is made successfully.
Production entry is protected by user-ID and password authentication, so that an
unauthenticated user cannot break into the system. Production Management System is
therefore a secure and performance-oriented project.
CHAPTER-3
PROBLEM DEFINITION
SYSTEM ARCHITECTURE
3.3: FEASIBILITY STUDY:
Feasibility Analysis:
After the problem is clearly understood and solutions are proposed, the next step is to
conduct the feasibility study, which is part of system analysis as well as the design process.
The main objective of the study is to determine whether the proposed system is feasible or not.
Mainly there are three types of feasibility to which the proposed system is subjected, as
discussed hereunder.
System analysis bridges the gap between system-level requirements and software design.
It allows us to refine the software allocation and to build models of the data, functional and
behavioral domains that will be treated by the software. The requirement analysis may be
divided into problem recognition, evaluation and synthesis, modeling, specification and
review. System analysis examines the use of data to carry out specific business processes
within the scope of the system investigation.
The software required for this system is as follows:
1. Windows XP
2. ASP.NET
3. SQL Server 2005
SQL Server 2005:
SQL Server ships with a number of administrative tools. Six of the more frequently used
tools are: Enterprise Manager, Query Analyzer, SQL Profiler, Service Manager, Data
Transformation Services and Books Online.
3.3.3 Operational Feasibility
Proposed projects are beneficial only if they can be turned into information systems that
will meet the organization's operating requirements. Simply stated, this test of feasibility asks
whether the system will work when it is developed and installed, and whether there are major
barriers to implementation. Here are questions that will help test the operational feasibility of
the project:
A. Is there sufficient support for the project from management and from users? If the
current system is well liked and used to the extent that people will not be able to see
reasons for a change, there may be resistance.
B. Are current business methods acceptable to the users? If they are not, users may
welcome a change that will bring about a more operational and useful system.
C. Have the users been involved in the planning and development of the project? Early
involvement reduces the chances of resistance to the system and to change in general,
and increases the likelihood of a successful project.
Issues that appear to be relatively minor in the beginning have a way of growing into
major problems after implementation. Therefore all operational aspects must be considered
carefully.
In brief, one of the most important questions analyzed is whether the operating staff,
i.e. the staff using the system, are ready for a new change and a new technology; otherwise
the whole system ultimately fails, since they are the ones who are going to use it. After the
operational feasibility analysis, it was found that almost all the staff members are computer
literate. Furthermore, they also understood the benefits of using a computer system and are
interested in having such software installed at their end, which shall help them perform their
duties in a systematic manner. This completes the operational feasibility study.
3.4 DATA COLLECTION AND ANALYSIS:
The data is collected in different ways. The collection of data is classified into two types:
Primary Data.
Secondary Data.
Primary Data:
Secondary Data:
Interview:
Interviews are used to collect information from the users. The data is collected by
conducting interviews with the higher officials, and interviews are also conducted across
various departments. Interviews allow us to discover areas of misunderstanding, unrealistic
expectations and even indications of resistance to the proposed system.
Questionnaires:
Through questionnaires, data is collected from the lower level to the higher level. Using
the collected information, we analyze the user benefits. This method is most useful when the
analyst needs to describe how documents are actually handled, how processes are carried out
and whether specified steps are actually followed.
Seminars:
The data is collected by conducting seminars with people from various departments. Reliable
information is gathered from the various seminar topics.
Manual Records:
The data is collected from the existing records maintained in the various departments.
Data analysis bridges the gap between system-level requirements engineering and software
design. It allows us to refine the software allocation and to build models of the data,
functional and behavioral domains that will be treated by the software. The requirement
analysis may be divided into problem recognition, evaluation and synthesis, modeling,
specification and review.
Data flow analysis examines the use of data to carry out specific business processes within
the scope of the system investigation. The components of the data flow strategy span both
requirements determination and system design. The tools used in data flow analysis are the
Data Flow Diagram, Data Dictionary, Data Structure Diagram and Structure Chart.
The Data Flow Diagram is the central tool and the basis from which the other components
are developed. A Data Dictionary is a catalog of the elements in a system; it lists all the
elements composing the data flowing through the system.
CHAPTER-4
SYSTEM SPECIFICATION
The Visual Studio 2008 Web Application Project model uses the same project, build
and compilation semantics as the Visual Studio .NET 2003 web project model:
All files contained within the project are defined within a project file (as well as the
assembly references and other project metadata settings). Files under the web's file-system
root that are not defined in the project file are not considered part of the web project.
All code files within the project are compiled into a single assembly that is built and
persisted in the \bin directory on each compile.
The compilation system uses a standard MSBuild based compilation process. This can be
extended and customized using standard MSBuild extensibility rules. You can control the
build through the property pages, for example, name the output assembly or add pre- and
post-build actions.
Because the Web Application Project model uses the same conceptual semantics as the Visual
Studio .NET 2003 web project model, it can make migrating projects much easier, minimizing
code changes. To fully enable Web Application Projects in Visual Studio 2005, you will need
to first install Microsoft Visual Studio 2005 - Update to Support Web Application Projects,
and then this add-in. Visual Studio 2005 Web Application Projects are not supported in
Visual Web Developer Express Edition.
Microsoft .NET Framework
Programs written for the .NET Framework execute in a software environment that
manages the program's runtime requirements. This runtime environment, which is also a part of
the .NET Framework, is known as the Common Language Runtime (CLR). The CLR provides
the appearance of an application virtual machine so that programmers need not consider the
capabilities of the specific CPU that will execute the program. The CLR also provides other
important services such as security, memory management, and exception handling. The class
library and the CLR together compose the .NET Framework.
The .NET Framework is included with Windows Server 2003, Windows Server 2008
and Windows Vista, and can be installed on some older versions of Windows.
Enterprise Manager:
It is the main administrative console for SQL Server installations. It provides
you with a graphical "bird's-eye" view of all of the SQL Server installations on your network.
You can perform high-level administrative functions that affect one or more servers, schedule
common maintenance tasks, or create and modify the structure of individual databases.
Query Analyzer:
Offers a quick and dirty method for performing queries against any of your SQL
Server databases. It's a great way to quickly pull information out of a database in response to a
user request, test queries before implementing them in other applications, create/modify stored
procedures and execute administrative tasks.
SQL Profiler:
Provides a window into the inner workings of your database. You can monitor
many different event types and observe database performance in real time. SQL Profiler allows
you to capture and replay system "traces" that log various activities. It's a great tool for
optimizing databases with performance issues or troubleshooting particular problems.
Service Manager:
It is used to control the MSSQL Server (the main SQL Server process), MSDTC
(Microsoft Distributed Transaction Coordinator) and SQL Server Agent processes. An icon for
this service normally resides in the system tray of machines running SQL Server. You can use
Service Manager to start, stop or pause any one of these services.
Data Transformation Services (DTS):
Provides an extremely flexible method for importing and exporting data between a
Microsoft SQL Server installation and a large variety of other formats. The most commonly
used DTS application is the "Import and Export Data" wizard found in the SQL Server
program group.
4.2: SOFTWARE DETAILS
In the modern world, the computer plays a vital role in the storage, maintenance, retrieval
and transformation of information. The importance of computerization, both in the area of
administrative applications and in automation, is clearly visualized and arises due to the
following reasons:
This first phase consists of a brief survey of the areas involved and results in either taking
the project into the next phase, postponing development for a period, or recommending that
no further action be taken. The purpose of the preliminary investigation is to evaluate the
project result, i.e. what benefits the organization will get after the completion of the project.
This helps the management evaluate the merits of the project and the request given by the
user, and make an informed judgment about the feasibility of the proposed system.
When the request is made, the first system activity, the preliminary investigation begins.
The project has gone through the following three steps:
Request clarification
Feasibility study
Request approval
Request Clarification
The first step of the system analysis process involves the identification of need. The analyst
(system engineer) meets with the customer and the end user. Identification of need is the
starting point in the evolution of a computer-based system. The analyst assists the customer
in defining the goals of the system.
The analyst makes sure to distinguish between the customer's "needs" and "wants".
Information gathered during this step is specified in a system concept document. The
customer sometimes prepares the original concept document before meeting with the analyst.
Invariably, customer-analyst communication results in modifications to the document.
Feasibility Study:
After the problem is clearly understood and solutions are proposed, the next step is to
conduct the feasibility study, which is part of system analysis as well as the design process.
The main objective of the study is to determine whether the proposed system is feasible or not.
Request Approval:
Because of the above-specified reasons, the study and the appropriate suggestions, the project
was approved for the development process.
System Design
System design is the process of planning a new business system to replace the old one. But
before this planning can be done, we must thoroughly understand the old system and
determine how the computer can be used to make its operation more effective.
Figure 1: System design
From a project management point of view, software design is conducted in two steps:
Preliminary Design
Detail Design
Preliminary design is concerned with the transformation of requirements into data and
software architecture. Detail design focuses on refinements to the architectural representation
that lead to detailed data structures and algorithmic representations for the software.
In this project we have prepared some important designs; they are given below:
Data design
Architectural design
Procedural design
Program design
Module design
Documentation design
Features of the Common Language Runtime:
The common language runtime manages memory, thread execution, code execution, code safety
verification, compilation, and other system services. These features are intrinsic to the managed
code that runs on the common language runtime. With regards to security, managed components
are awarded varying degrees of trust, depending on a number of factors that include their origin
(such as the Internet, enterprise network, or local computer). This means that a managed
component might or might not be able to perform file-access operations, registry-access
operations, or other sensitive functions, even if it is being used in the same active application.The
runtime enforces code access security. For example, users can trust that an executable embedded
in a Web page can play an animation on screen or sing a song, but cannot access their personal
data, file system, or network. The security features of the runtime thus enable legitimate Internet-
deployed software to be exceptionally feature rich. The runtime also enforces code robustness
by
implementing a strict type- and code-verification infrastructure called the common type system
(CTS). The CTS ensures that all managed code is self-describing. The various Microsoft and
third-party language compilers generate managed code that conforms to the CTS. This means
that managed code can consume other managed types and instances, while strictly enforcing type
fidelity and type safety.
In addition, the managed environment of the runtime eliminates many common software issues.
For example, the runtime automatically handles object layout and manages references to objects,
releasing them when they are no longer being used. This automatic memory management
resolves the two most common application errors, memory leaks and invalid memory references.
The runtime also accelerates developer productivity. For example, programmers can write
applications in their development language of choice, yet take full advantage of the runtime, the
class library, and components written in other languages by other developers. Any compiler
vendor who chooses to target the runtime can do so. Language compilers that target the .NET
Framework make the features of the .NET Framework available to existing code written in that
language, greatly easing the migration process for existing applications.
While the runtime is designed for the software of the future, it also supports software of today
and yesterday. Interoperability between managed and unmanaged code enables developers to
continue to use necessary COM components and DLLs.
The runtime is designed to enhance performance. Although the common language runtime
provides many standard runtime services, managed code is never interpreted. A feature called
just-in-time (JIT) compiling enables all managed code to run in the native machine language of
the system on which it is executing. Meanwhile, the memory manager removes the possibilities
of fragmented memory and increases memory locality-of-reference to further increase
performance.
C#.NET:
Windows Forms is the new platform for Microsoft Windows application development, based on
the .NET Framework. This framework provides a clear, object-oriented, extensible set of classes
that enable you to develop rich Windows applications. Additionally, Windows Forms can act as
the local user interface in a multi-tier distributed solution. Windows Forms is a framework for
building Windows client applications that utilize the common language runtime. Windows
Forms applications can be written in any language that the common language runtime supports.
What Is a Form?
A form is a bit of screen real estate, usually rectangular, that you can use to present information
to the user and to accept input from the user. Forms can be standard windows, multiple document
interface (MDI) windows, dialog boxes, or display surfaces for graphical routines. The easiest
way to define the user interface for a form is to place controls on its surface. Forms are objects
that expose properties which define their appearance, methods which define their behavior, and
events which define their interaction with the user. By setting the properties of the form and
writing code to respond to its events, you customize the object to meet the requirements of your
application.
As with all objects in the .NET Framework, forms are instances of classes. The form you create
with the Windows Forms Designer is a class, and when you display an instance of the form at
run time, this class is the template used to create the form. The framework also allows you to
inherit from existing forms to add functionality or modify existing behavior. When you add a
form to your project, you can choose whether it inherits from the Form class provided by the
framework, or from a form you have previously created.
Additionally, forms are controls, because they inherit from the Control class.
Within a Windows Forms project, the form is the primary vehicle for user interaction. By
combining different sets of controls and writing code, you can elicit information from the user
and respond to it, work with existing stores of data, and query and write back to the file system
and registry on the user's local computer.
Although the form can be created entirely in the Code Editor, it is easier to use the Windows
Forms Designer to create and modify forms.
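As a minimal sketch of the concepts above (a hand-written form rather than one produced by the Windows Forms Designer; the window title and button text are illustrative, not taken from the project):

```csharp
using System;
using System.Windows.Forms;

// A form is just a class inheriting from Form; properties define its
// appearance and event handlers define its interaction with the user.
class GreetingForm : Form
{
    public GreetingForm()
    {
        Text = "Production Management System";    // window title (illustrative)

        Button btn = new Button();
        btn.Text = "Click me";
        // Respond to an event by attaching a handler to it.
        btn.Click += (sender, e) => MessageBox.Show("Hello from Windows Forms");
        Controls.Add(btn);                        // place the control on the form's surface
    }

    [STAThread]
    static void Main()
    {
        // The class is the template; Run displays one instance of it.
        Application.Run(new GreetingForm());
    }
}
```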
Lower total cost of ownership: Windows Forms takes advantage of the versioning
and deployment features of the common language runtime to offer reduced
deployment costs and higher application robustness over time. This significantly
lowers the maintenance costs (TCO) for applications written in Windows Forms.
Architecture for controls: Windows Forms offers an architecture for controls and
control containers that is based on concrete implementation of the control and
container classes. This significantly reduces control-container interoperability issues.
Security: Windows Forms takes full advantage of the security features of the common
language runtime. This means that Windows Forms can be used to implement everything
from an untrusted control running in the browser to a fully trusted application installed
on a user's hard disk.
XML Web services support: Windows Forms offers full support for quickly and
easily connecting to XML Web services.
Rich graphics: Windows Forms is one of the first ship vehicles for GDI+, a new
version of the Windows Graphical Device Interface (GDI) that supports alpha
blending, texture brushes, advanced transforms, rich text support, and more.
Flexible controls: Windows Forms offers a rich set of controls that encompass all of
the controls offered by Windows. These controls also offer new features, such as "flat
look" styles for buttons, radio buttons, and check boxes.
Data awareness: Windows Forms offers full support for the ADO data model.
ActiveX control support: Windows Forms offers full support for ActiveX controls.
You can easily host ActiveX controls in a Windows Forms application. You can also
host a Windows Forms control as an ActiveX control.
Licensing: Windows Forms takes advantage of the common language runtime
enhanced licensing model.
Printing: Windows Forms offers a printing framework that enables applications to
provide comprehensive reports.
Design-time support: Windows Forms takes full advantage of the meta-data and
component model features offered by the common language runtime to provide
thorough design-time support for both control users and control implementers.
ADO.NET :
ADO.NET is an evolution of the ADO data access model that directly addresses user
requirements for developing scalable applications. It was designed specifically for the web with
scalability, statelessness, and XML in mind.
ADO.NET uses some ADO objects, such as the Connection and Command objects, and also
introduces new objects. Key new ADO.NET objects include the DataSet, DataReader, and
DataAdapter.
A DataAdapter is the object that connects to the database to fill the DataSet. Then, it connects
back to the database to update the data there, based on operations performed while the DataSet
held the data. In the past, data processing has been primarily connection-based. Now, in an effort
to make multi-tiered apps more efficient, data processing is turning to a message-based approach
that revolves around chunks of information. At the center of this approach is the DataAdapter,
which provides a bridge to retrieve and save data between a DataSet and its source data store. It
accomplishes this by means of requests to the appropriate SQL commands made against the data
store.
The XML-based DataSet object provides a consistent programming model that works with all
models of data storage: flat, relational, and hierarchical. It does this by having no 'knowledge' of
the source of its data, and by representing the data that it holds as collections and data types. No
matter what the source of the data within the DataSet is, it is manipulated through the same set of
standard APIs exposed through the DataSet and its subordinate objects.
While the DataSet has no knowledge of the source of its data, the managed provider has detailed
and specific information. The role of the managed provider is to connect, fill, and persist the
DataSet to and from data stores. The OLE DB and SQL Server .NET Data Providers
(System.Data.OleDb and System.Data.SqlClient) that are part of the .Net Framework provide
four basic objects: the Command, Connection, DataReader and DataAdapter. In the remaining
sections of this document, we'll walk through each part of the DataSet and the OLE DB/SQL
Server .NET Data Providers explaining what they are, and how to program against them.
The following sections will introduce you to some objects that have evolved, and some that
are new. These objects are: Connections, Commands, DataReaders, DataSets and
DataAdapters.
When dealing with connections to a database, there are two different options: SQL Server .NET
Data Provider (System.Data.SqlClient) and OLE DB .NET Data Provider (System.Data.OleDb).
In these samples we will use the SQL Server .NET Data Provider. These are written to talk
directly to Microsoft SQL Server. The OLE DB .NET Data Provider is used to talk to any OLE
DB provider (as it uses OLE DB underneath).
Connections
Connections are used to 'talk to' databases, and are represented by provider-specific classes
such as SqlConnection. Commands travel over connections, and result sets are returned in the
form of streams which can be read by a DataReader object or pushed into a DataSet object.
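As a minimal sketch of opening such a connection (the server name and database in the connection string are placeholders, not taken from the project):

```csharp
using System;
using System.Data.SqlClient;

class ConnectionDemo
{
    static void Main()
    {
        // Hypothetical connection string; adjust server and database names.
        string connStr =
            "Data Source=localhost;Initial Catalog=IFCAL_PMS;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();    // open the stream over which commands will travel
            Console.WriteLine("Connection state: " + conn.State);
        }   // Dispose() closes the connection even if an exception is thrown
    }
}
```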
Commands
Commands contain the information that is submitted to a database, and are represented by
provider-specific classes such as SqlCommand. A command can be a stored procedure call, an
UPDATE statement, or a statement that returns results. You can also use input and output
parameters, and return values, as part of your command syntax. The example below shows
how to issue an INSERT statement against the Northwind database.
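The example referred to above was not reproduced in the report; a minimal sketch of such an INSERT (assuming the standard Northwind sample database is installed locally, with a placeholder connection string) could look like:

```csharp
using System.Data.SqlClient;

class InsertDemo
{
    static void Main()
    {
        // Placeholder connection string pointing at the Northwind sample database.
        string connStr =
            "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO Shippers (CompanyName, Phone) VALUES (@name, @phone)", conn))
        {
            // Input parameters handle quoting and guard against SQL injection.
            cmd.Parameters.AddWithValue("@name", "Speedy Express II");
            cmd.Parameters.AddWithValue("@phone", "(503) 555-0199");

            conn.Open();
            int rows = cmd.ExecuteNonQuery();   // returns the number of rows affected
        }
    }
}
```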
DataReaders
The DataReader provides a forward-only, read-only stream of data returned from the
database, and is represented by provider-specific classes such as SqlDataReader.
DataSets
The DataSet object is similar to the ADO Recordset object, but more powerful, and with one
other important distinction: the DataSet is always disconnected. The DataSet object represents
a cache of data, with database-like structures such as tables, columns, relationships, and
constraints. However, though a DataSet can and does behave much like a database, it is
important to remember that DataSet objects do not interact directly with databases or other
source data. This allows the developer to work with a programming model that is always
consistent, regardless of where the source data resides. Data coming from a database, an
XML file, code, or user input can all be placed into DataSet objects. Then, as changes are
made to the DataSet, they can be tracked and verified before updating the source data. The
GetChanges method of the DataSet object actually creates a second DataSet that contains
only the changes to the data. This DataSet is then used by a DataAdapter (or other objects)
to update the original data source.
The DataSet has many XML characteristics, including the ability to produce and consume XML
data and XML schemas. XML schemas can be used to describe schemas interchanged via
WebServices. In fact, a DataSet with a schema can actually be compiled for type safety and
statement completion.
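As an illustrative, in-memory sketch of this disconnected model (the table and column names are invented for the example, not taken from the project schema):

```csharp
using System;
using System.Data;

class DataSetDemo
{
    static void Main()
    {
        // Build a small disconnected cache by hand; no database is involved.
        DataSet ds = new DataSet("Production");
        DataTable shifts = ds.Tables.Add("Shifts");
        shifts.Columns.Add("ShiftName", typeof(string));
        shifts.Columns.Add("PowerConsumedKWh", typeof(double));

        shifts.Rows.Add("Morning", 1250.5);
        ds.AcceptChanges();                 // mark the current state as "clean"

        shifts.Rows.Add("Evening", 1310.0); // a new, not-yet-accepted change

        // GetChanges returns a second DataSet holding only the modified rows.
        DataSet delta = ds.GetChanges();
        Console.WriteLine(delta.Tables["Shifts"].Rows.Count);   // prints 1
    }
}
```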
DataAdapters (OLEDB/SQL)
The DataAdapter object works as a bridge between the DataSet and the source data. Using the
provider-specific SqlDataAdapter (along with its associated SqlCommand and SqlConnection)
can increase overall performance when working with a Microsoft SQL Server database. For
other OLE DB-supported databases, you would use the OleDbDataAdapter object and its
associated OleDbCommand and OleDbConnection objects.
The DataAdapter object uses commands to update the data source after changes have been made
to the DataSet. Using the Fill method of the DataAdapter calls the SELECT command; using the
Update method calls the INSERT, UPDATE or DELETE command for each changed row. You
can explicitly set these commands in order to control the statements used at runtime to resolve
changes, including the use of stored procedures. For ad-hoc scenarios, a CommandBuilder object
can generate these at run-time based upon a select statement. However, this run-time generation
requires an extra round-trip to the server in order to gather required metadata, so explicitly
providing the INSERT, UPDATE, and DELETE commands at design time will result in better
run-time performance.
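The Fill/Update round trip described above can be sketched as follows (the connection string and table schema are illustrative assumptions, not the project's actual schema):

```csharp
using System.Data;
using System.Data.SqlClient;

class AdapterDemo
{
    static void Main()
    {
        // Placeholder connection string and table; adapt to the real schema.
        string connStr =
            "Data Source=localhost;Initial Catalog=IFCAL_PMS;Integrated Security=True";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT Id, ItemName, Quantity FROM RawMaterials", conn))
        {
            // For this ad-hoc case, let a CommandBuilder derive the
            // INSERT/UPDATE/DELETE commands from the SELECT at run time
            // (at the cost of an extra metadata round trip).
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

            DataSet ds = new DataSet();
            adapter.Fill(ds, "RawMaterials");       // executes the SELECT command

            ds.Tables["RawMaterials"].Rows[0]["Quantity"] = 500;

            adapter.Update(ds, "RawMaterials");     // executes the derived UPDATE
        }
    }
}
```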
Remember, you can execute a command directly against the database in order to do
inserts, updates, and deletes; you don't need to first put data into a DataSet in order to
insert, update, or delete it. You can also use a DataSet to bind to the data, move through
the data, and navigate data relationships.
Books Online:
An often-overlooked resource provided with SQL Server that contains answers to a
variety of administrative, development and installation issues. It's a great resource to consult
before turning to the Internet or technical support.
The hardware specification of the system on which the project has been developed is
presented below:
Microprocessor : Intel Pentium IV
System Bus Speed : 533 MHz
Processor Speed : 2.0 GHz
Main Memory : 256 MB
Secondary Memory : 40 GB
Secondary Memory Speed : 266 MHz
CHAPTER-5
SYSTEM PLANNING
System Planning is a method for analyzing, defining and designing the information
architecture of organizations. It was later made available to customers, and the method
became an important tool for many organizations. It is a very complex method dealing with
data, processes, strategies, aims and organizational departments, all of which are
interconnected. System Planning brings a new approach to designing an information
architecture.
Planning for a system has a time-horizon dimension and a focus dimension. The time-horizon
dimension specifies the time range of the plan, whereas the focus dimension relates to whether
the primary concern is strategic, managerial, or operational. The system, i.e. the project that
we were assigned, was required to be completed within 25 weeks. What we planned is as
follows:
CHAPTER-6
SYSTEM DESIGN
6.1: PROJECT DESCRIPTION:
PRODUCTION:
The Production module has two modes, i.e. Admin mode and User mode, and generates two very
important reports.
Admin mode contains sub-modules for:
Maintaining the details of the production process, with the privilege to modify the
details whenever necessary.
Maintaining the details of the various departments.
The Department mode has the following sub-modules, which are generally used to
show various reports that are accessible only to the admin.
This mode contains the following sub-modules:
Form Details:
The function of this sub-module is to give details about the opening and closing of the
tap hole on the previous day.
Shutdown Report:
The function of this sub-module is to give a detailed report of the shutdown. The user can
view the total shutdown hours of a particular session and also the reason for the
shutdown (electrode broken, water leakage, sparking, slipping, power failure, etc.). It shows all
the details month-wise with a specific reason.
Technical Data:
This report shows all the information about the items used in the furnace during the
production process: the running hours of that process, how much power was consumed
during that particular production process, how much briquette, sponge iron, coal and coke
were consumed, and also the percentage of fixed carbon and V.M. in the reductant.
This mode contains the following sub-modules:
Form Details:
This gives the details of previous data, which can be used for reference purposes only.
Shifting Details:
The function of this sub-module is to record the exact details of the tap-hole opening time,
closing time, KWH reading at tap, and voltage at a particular time.
Raw Material Details:
The function of this sub-module is to record the use of raw materials and also to maintain the stock.
Furnace Details:
The function of this module is to maintain records of the furnace bottom temperature, slipping of
the electrode, water pressure of the furnace, etc.
Last Slipping:
This page gives the details of the last electrode slipping, if one occurred.
Production Summary:
This sub-module shows the details of production, such as total paste charge used, total power
consumption, etc.
Raw Materials:
This sub-module is used to keep the records of all raw materials and also maintains the stock.
Shutdown and Production:
This sub-module is used to store the stoppage details and also generates a monthly report.
This sub-module records the entire production process and the material content of the HCFC, and also
generates reports.
Jigging:
This sub-module comes after the production process, to convert the HCFC into various forms, i.e. into
small chips or powder, ready for sale and despatch.
Loading Schedule:
The main function of this module is to maintain records of raw-material loading into the furnace, such as loading
briquette, loading chromite, loading quartzite, loading reductant, etc. This
module has several sub-modules:
Feeding of Chromite:
The function of this sub-module is to fetch the data of chromite, i.e. the percentage of Cr2O3 and FeO, its
source and the blending ratio, and then maintain a record of all the data.
Feeding of Briquette:
The function of this sub-module is to fetch the data of briquette, i.e. the percentage of Cr2O3 and FeO, its
source and the blending ratio, and then maintain a record of all the data.
Feeding of Reductant:
The function of this sub-module is to retrieve data related to the reductant from the database, i.e. the
percentage of ash, V.M., F.C. and P, calculate the blending ratio, and then maintain a record of all the data.
Feeding of Quartzite:
The function of this sub-module is to retrieve data related to quartzite from the database, i.e. the
percentage of SiO2, calculate the blending ratio, and then record all the data.
Despatch Clearance:
The function of this module is to keep records of the raw-material despatch process: it records the
name of the party, the date of despatch, the quantity of raw material despatched, the percentage of
Cr, C, Si and P, its size, etc.
Chemical Laboratory:
The main function of this sub-module is to keep records of the percentage of other elements in the raw
materials. Here all the raw materials are tested, and according to the test results the data are
recorded in their specific tables.
This module has the following sub-modules:
Coke Analysis:
It records the percentage of other elements in the coke, such as moisture, V.M., sulphur, phosphorus and ash.
Briquette Analysis:
It records the percentage of other elements in the briquette, such as SiO2, Al2O3, Fe, Cr2O3, CaO, MgO and
phosphorus.
Quartzite Analysis:
It records the percentage of other elements in the quartzite, such as SiO2, Al2O3, Fe, CaO and MgO.
Molasses Analysis:
It records properties of the molasses such as density, C3S and SO3.
Consolidated Analysis:
Stock of FPH: It maintains the stock of finished product after production, and also
generates reports.
The primary need at the outset of design is the database. An important requirement in
the design is the representation of data in different tables. The data items for the tables are classified
based on their characteristics, and the relationships between the data items are identified in each table.
Certain data that are unlikely to change, and that facilitate the
operation of other processes, are kept in the master tables. Similarly, data that form a part of a
transaction are put into the transaction tables.
Thus, looking at the different data available, the tables are classified as master tables,
transaction tables, etc., and the database design is carefully done, aiming to achieve its main
objectives:
Data integration.
Data independence.
Normalization:
Normalization is the process of analyzing the given relation schemas, based on their
functional dependencies and primary keys, to achieve the desirable properties of:
Minimizing redundancy.
Minimizing insertion, deletion and update anomalies.
Normalization is carried out for the following reasons:
To structure the data so that the relationships between entities can be represented correctly.
To permit simple retrieval of data in response to query and report requests.
To reduce the need to restructure or reorganize data when new application requirements
arise.
A table (relation) is in 2NF if it is in First Normal Form and every non-key attribute is
fully dependent on the whole of the key.
2NF is sometimes phrased as: a table is in 2NF if it is in 1NF and it has no partial
dependencies.
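As a worked illustration of the partial-dependency rule (the relation and attributes are invented for the example, not drawn from the project's schema): an order-line relation whose key is {OrderID, ItemID}, but where ItemName depends on ItemID alone, is not in 2NF and is decomposed as follows:

```latex
% Not in 2NF: ItemName depends on only part of the key (a partial dependency)
R(\underline{OrderID},\,\underline{ItemID},\,ItemName,\,Qty),
  \qquad ItemID \to ItemName
% 2NF decomposition: move the partially dependent attribute into its own relation
R_1(\underline{OrderID},\,\underline{ItemID},\,Qty)
  \qquad R_2(\underline{ItemID},\,ItemName)
```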
This figure represents all the tables in the project.
Registration Tables:
It shows the registration table of the admin.
Material Information:
It shows the employee name, id and dept_id.
Department Table:
Employee Table:
Designation Table:
A data flow diagram is a graphical representation that depicts information flow and the
transforms that are applied as data move from input to output. A Data Flow
Diagram is also known as a Data Flow Graph or a Bubble Chart.
The Data Flow Diagram may be used to represent a system or software at any level of
abstraction. Data Flow Diagrams may be partitioned into levels that represent increasing
information flow and functional detail. Therefore the Data Flow Diagram provides a mechanism
for functional modeling as well as information modeling.
A level-0 DFD, also called a fundamental system model or a context model, represents the
entire software element as a single bubble, with input and output data indicated by incoming and
outgoing arrows respectively. Additional processes and information flow paths are represented as
the level-0 DFD is partitioned to reveal more detail. For example, a level-1 DFD might contain
five or six bubbles with interconnecting arrows. Each of the processes represented at level 1 is a
sub-function of the overall system depicted in the context model.
A fundamental system model indicates the primary input and the ultimate output; the basic notation
used to develop a DFD is not in itself sufficient to describe requirements for software.
Each component of a DFD is labeled with a descriptive name. Process names are
further numbered for identification purposes. The number assigned to a specific
process does not correspond to the sequence of processes; it is strictly for identification.
A data flow diagram allows parallel activities, i.e. a number of data flows coming out
of a source and going into a destination. A DFD concentrates on the data moving through
the system and not on the devices or equipment. A DFD may consist of a number of levels. The
top-level diagram is called the Context Diagram; it consists of a single process and plays a
very important role in studying the system. It gives the most general and broadest view of the system.
Moreover, it gives a pictorial representation of the scope boundaries of the system under study.
DFD SYMBOLS:
Data store: a repository of data that is stored for use by one or more processes.
6.5: DFD DIAGRAMS:
Level-1 DFD Diagram
Level-2 DFD Diagram
6.6: PROCESS FLOW DIAGRAMS
ER Diagram:
Admin Use Case Diagram:
Production Use Case Diagram:
Purchase Use Case Diagram:
Stock Use Case Diagram:
Lab Use Case Diagram:
Class Diagram:
Production Sequence Diagram:
Sale Sequence Diagram:
Stock Sequence Diagram:
Lab Sequence Diagram:
CHAPTER-7:
Testing is the process of creating a program with the explicit intention of finding
errors, that is, making the program fail. A successful test, then, is one that finds an as-yet-undiscovered
error. As an additional benefit, testing demonstrates that the software functions appear
to be working according to the specifications.
PURPOSE OF TESTING:
UNIT TESTING:
Instead of testing the system as a whole, unit testing focuses on the modules that make
up the system. Each module is taken up individually and tested for correctness in coding and
logic. Errors resulting from the interaction of modules are initially avoided. The advantages of unit
testing are:
The size of a module is quite small, so errors can be easily located, and the confusing interaction of multiple
errors in widely different parts of the software is eliminated.
Module-level testing can be exhaustive.
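A minimal sketch of a unit test in C# (the ShutdownReport class and its TotalHours method are hypothetical stand-ins for a project module, and a bare check stands in for a test framework such as NUnit):

```csharp
using System;

// Hypothetical module under test: totals shutdown durations in hours.
static class ShutdownReport
{
    public static double TotalHours(double[] stoppages)
    {
        double total = 0;
        foreach (var h in stoppages) total += h;
        return total;
    }
}

class UnitTestDemo
{
    static void Main()
    {
        // Unit test: exercise the module in isolation with a known input.
        double result = ShutdownReport.TotalHours(new[] { 1.5, 2.0, 0.5 });
        if (result != 4.0) throw new Exception("TotalHours test failed");
        Console.WriteLine("TotalHours test passed");
    }
}
```

Because the module is exercised in isolation, a failure here points directly at TotalHours rather than at some interaction between modules.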
INTEGRATION TESTING:
It tests for the errors resulting from the integration of modules. One specific target of
integration testing is the interface: whether parameters match on both sides as to type,
permissible ranges and meaning. The analyst tries to find areas where modules have been designed
with different specifications for data length, data types and data element names. It is a black-box
testing method.
Implementation is the process of having system personnel check out and put new
equipment into use, train users, install the new application and construct any files or the
database needed to use it. Depending on the size of the organization that will be involved in
using the application, and the risk associated with its use, the developers may choose to pilot the
operation in only one area of the firm. Regardless of the implementation strategy,
developers strive to ensure that the system's initial use is trouble-free.
Once installed, applications are often used for years. However, both the organization and
the users will change, and the environment will be different over weeks and months.
Therefore the production system has to be maintained. Modifications and changes will be made to the
software, database and procedures to meet emerging user requirements.
System implementation consists of system coding and system testing.
Coding:
Coding for the software has been done in ASP.NET (C#) and SQL Server 2005.
CHAPTER 8
Software Development has many phases. These phases include Requirements Engineering,
Architecting, Design, Implementation, Testing, Software Deployment and maintenance.
Maintenance is the last stage of the software life cycle. After the product has been released, the
maintenance phase keeps the software up to date with environment changes and changing user
requirements. Maintenance can only happen efficiently if the earlier phases are done properly.
There are four major problems that can slow down the maintenance process: unstructured code,
maintenance programmers having insufficient knowledge of the system, documentation being
absent, out of date, or at best insufficient, and software maintenance having a bad image. The
success of the maintenance phase relies on these problems being fixed earlier in the life cycle.
Maintenance consists of four parts. Corrective maintenance deals with fixing bugs in the
code. Adaptive maintenance deals with adapting the software to new environments. Perfective
maintenance deals with updating the software according to changes in user requirements. Finally,
preventive maintenance deals with updating documentation and making the software more
maintainable. All changes to the system can be characterized by these four types of maintenance.
Corrective maintenance is ‘traditional maintenance’ while the other types are considered as
‘software evolution.’
CHAPTER 9
Cost–benefit analysis is a systematic process for calculating and comparing the benefits and costs of
a project, decision or government policy.
It involves comparing the total expected cost of each option against the total expected benefits, to
see whether the benefits outweigh the costs, and by how much.
Benefits and costs are expressed in money terms, and are adjusted for the time value of money,
so that all flows of benefits and flows of project costs over time are expressed on a common
basis in terms of their "net present value."
Cost–benefit analysis attempts to measure the positive or negative consequences of a project.
The cost estimates provided here exclude the costs of hardware and software.
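The "net present value" adjustment mentioned above can be written, for benefits $B_t$ and costs $C_t$ in year $t$, discount rate $r$ and planning horizon $T$, as:

```latex
\mathrm{NPV} \;=\; \sum_{t=0}^{T} \frac{B_t - C_t}{(1 + r)^{t}}
```

A project is worthwhile on this criterion when the NPV is positive, i.e. when discounted benefits outweigh discounted costs.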
Regardless of the aim, all cost–benefit analyses have several properties in common. A
cost–benefit analysis begins with a problem to be solved. The results of a cost–benefit
analysis can also be used to raise the level of public debate surrounding a project.
An Overview of UML
Relationships in the UML:
There are four kinds of relationships in the UML:
1. Dependency
2. Association
3. Generalization
4. Realization
A use case is a description of a set of sequences of actions that a system performs that yields an
observable result of value to a particular actor. A use case diagram is one
type of UML diagram, which represents all the use cases of the system, the actors, and the relationships
between them. Use case diagrams are one of the five diagrams in the UML for modeling the
dynamic aspects of systems (activity diagrams, sequence diagrams, statechart diagrams and
collaboration diagrams are the four other kinds of diagrams in the UML for modeling the
dynamic aspects of systems). Use case diagrams are central to modeling the behavior of a
system, a sub-system, or a class. Each one shows a set of use cases, actors and relationships.
Common Properties:
A Use Case diagram is just a special kind of diagram and shares the same common properties, as
do all other diagrams- a name and graphical contents that are a projection into the model. What
distinguishes a use case diagram from all other kinds of diagrams is its particular content.
Contents
Use Case diagrams commonly contain:
Use Cases
Actors
Dependency, generalization, and association relationships
The actors of the system are admin, user, contractor and security. The use cases of the system
are authentication, edit profile, provide vehicle, maintain fire fighting, appoint contractor and
security, create challan, report generation, worker creation, and keep entry and exit details.
CHAPTER-10
SCREEN SHOTS
Login page:
Registration page:
Admin home page:
Production home page:
Stock home page:
User registration page:
Testing:
System testing is a stage of implementation, aimed at ensuring that the system
works at all levels and is effective before live operation starts. The system test should give definite
confirmation that everything is correct, and an opportunity to show the users that the system works.
Software testing is the critical element of software quality assurance and represents
the ultimate review of specification, design and coding. If testing is done successfully, it will
uncover errors in the software.
CONCLUSION
This software provides a user-friendly approach towards the system. It has been well
developed and, when implemented, is bound to satisfy all the requirements. Painstaking efforts
have been taken to make the software impeccable and upgradeable.
This system enables users to perform better and more efficient work; it speeds up all the
activities and provides good communication amongst all departments.
There is a hope that this software will be utilized to its maximum and will do a good job
in the long run.
BIBLIOGRAPHY
Black Book by Richard Wobson
ASP.NET Bible
Complete Reference by McDonald
Google search
Web references:
www.cooltext.com
www.ifcal.com