Gym Management
PROJECT – TOPIC
Address :
Semester : VI
Course Title : BCA
Study Centre :
GOVT. TRS COLLEGE, REWA
Date................. Signature..................
SELF CERTIFICATE
This is to certify that the project work entitled “Gym Management”, submitted in
partial fulfillment of the requirements for the award of the degree of B.C.A., has been
carried out under the guidance of Mr. ASHISH KUMAR SEN. The matter embodied in this
project work has not been submitted earlier for the award of any degree or diploma.
RAVI SONI
Roll No.: 1214075
Enrollment No.: ……………….
ACKNOWLEDGEMENT
I owe a great many thanks to a great many people who helped and
supported me during the development of this project.
My deepest thanks to Dr. Anil Tiwari, HOD of the BCA Department and the guide of
this project, for guiding and correcting my various documents with attention
and care. He has taken pains to go through the project and make the necessary
corrections as and when needed.
I express my thanks to the Hon'ble Principal, Dr. A. P. Mishra of Govt. TRS
College, Rewa, for extending his support.
RAVI SONI
Roll No.: 1214074
Enrollment No.: ……………….
DECLARATION
I hereby declare that this project work for the degree of BACHELOR OF
COMPUTER APPLICATION embodies my own work, except the guidance and
suggestions received during the work, which have been suitably acknowledged.
RAVI SONI
Roll No.: 1214075
Enrollment No.: ……………….
PREFACE
This project report has been designed specifically to meet the requirements of “GYM
MANAGEMENT”, which is a Windows-based software. The aim of preparing the project is to
develop a working software, and it has been an earnest effort to develop this project
systematically.
INTRODUCTION TO PROJECT
OBJECTIVE OF PROJECT
INTRODUCTION TO .NET
SYSTEM ANALYSIS
NORMALIZATION
ER-DIAGRAM
ABOUT MS-ACCESS
OUTPUT
SOURCE CODE
MAINTENANCE
CONCLUSION
BIBLIOGRAPHY
INTRODUCTION
A budget is a financial plan for the future concerning the revenues and costs of a business.
However, a budget is about much more than just financial numbers. Budgetary control is the
process by which financial control is exercised within an organization. Budgets for
income/revenue and expenditure are prepared in advance and then compared with actual
performance to establish any variances.
Managers are responsible for controllable costs within their budgets and are required to take
remedial action if adverse variances arise and are considered excessive.
There are many management uses for budgets: for example, planning, coordination,
communication and motivation. Whilst there are many uses of budgets, there is a set of
guiding principles for good budgetary control in a business.
METHODOLOGY TO BE USED
HARDWARE
Computer / Processor : any standard PC on which the required software is available.
Operating System : Microsoft Windows (a recent version is recommended).
Peripherals : keyboard, mouse and monitor.
Network (Optional) : Microsoft TCP/IP.
SOFTWARE
In Visual Basic we will learn how to create Visual Basic projects, see what is in
such projects, and see what is new in Visual Basic .NET. We can take an overview of essential
Visual Basic concepts such as Windows and Web Forms, controls, events, properties, methods
and so on.
There is a special facility in Visual Basic, the IDE (Integrated Development
Environment), where you can run, debug and compile your programs. The main window of
Visual Basic .NET is called the IDE.
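The concepts above (forms, controls, properties and events) can be sketched in a few lines of Visual Basic .NET. The class and control names here are illustrative only, not part of the project's actual source:

```vbnet
' A minimal Windows Forms sketch showing a form, a control,
' properties and an event handler.
Imports System.Windows.Forms

Public Class MemberForm
    Inherits Form

    Private WithEvents btnSave As New Button()

    Public Sub New()
        Me.Text = "Gym Management"   ' a property of the form
        btnSave.Text = "Save"        ' a property of the control
        Me.Controls.Add(btnSave)     ' place the control on the form
    End Sub

    ' An event handler: runs when the user clicks the button.
    Private Sub btnSave_Click(sender As Object, e As EventArgs) _
            Handles btnSave.Click
        MessageBox.Show("Record saved.")
    End Sub
End Class
```

The `Handles` clause is what wires the method to the control's event; the IDE generates such handlers automatically when you double-click a control.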
Visual Basic .NET Professional Features
When editing program text, the “IntelliSense” technology informs you in a little popup
window about the types of constructs that may be entered at the current cursor location.
VB.NET is a component integration language, which is attuned to Microsoft’s
Component Object Model (“COM”).
Interfaces of COM components can be easily called remotely via Distributed COM
(“DCOM”), which makes it easy to construct distributed applications.
COM components can be embedded in, or linked to, your application’s user interface and
also in/to stored documents (Object Linking and Embedding, “OLE”; “Compound
Document”).
Nevertheless, we are convinced that there remain many problems that have not been addressed
at all by Java, Visual Basic or other languages, or that have not found a really satisfactory
solution so far.
Visual Basic .NET is an excellent creator as well as consumer of COM-based components. The
following tools are popular COM hosts:
Visual InterDev
Visual C++
Microsoft Access
Visual FoxPro
Developer 2000
Data Access
All areas of data access have been improved to make it easier to perform your most common
database activities.
Universal Data access, with integrated ADO/OLEDB support.
Visual Database tools integrated into the Visual Basic.Net environment.
New Oracle schema and stored procedure design capabilities.
Data environment Designer for authoring ADO based data access components.
New Integrated Report Writer.
Hierarchical Flex Grid control for displaying hierarchical data.
Ability to create Data Sources.
Ability to easily remote data from machine to machine, tier to tier.
Advanced data binding.
The .NET Framework has two main components: the common language runtime and the .NET
Framework class library. The common language runtime is the foundation of the .NET
Framework. You can think of the runtime as an agent that manages code at execution time,
providing core services such as memory management, thread management, and remoting,
while also enforcing strict type safety and other forms of code accuracy that promote security
and robustness. In fact, the concept of code management is a fundamental principle of the
runtime. Code that targets the runtime is known as managed code, while code that does not
target the runtime is known as unmanaged code.
The class library, the other main component of the .NET Framework, is a comprehensive,
object-oriented collection of reusable types that we can use to develop applications ranging
from traditional command-line or graphical user interface (GUI) applications to applications
based on the latest innovations provided by ASP.NET, such as Web Forms and XML Web
services.
The .NET Framework can be hosted by unmanaged components that load the common
language runtime into their processes and initiate the execution of managed code, thereby
creating a software environment that can exploit both managed and unmanaged features.
The .NET Framework not only provides several runtime hosts, but also supports the
development of third-party runtime hosts.
For example, ASP.NET hosts the runtime to provide a scalable, server-side environment for
managed code. ASP.NET works directly with the runtime to enable ASP.NET applications
and XML Web services, both of which are discussed later in this topic.
Internet Explorer is an example of an unmanaged application that hosts the runtime (in the
form of a MIME type extension). Using Internet Explorer to host the runtime enables you to
embed managed components or Windows Forms controls in HTML documents. Hosting the
runtime in this way makes managed mobile code (similar to Microsoft® ActiveX® controls)
possible, but with significant improvements that only managed code can offer, such as semi-
trusted execution and isolated file storage.
The following illustration shows the relationship of the common language runtime and the
class library to your applications and to the overall system. The illustration also shows how
managed code operates within a larger architecture.
The following sections describe the main components and features of the .NET Framework in
greater detail.
Features of the Common Language Runtime
The common language runtime manages memory, thread execution, code execution, code
safety verification, compilation, and other system services. These features are intrinsic to the
managed code that runs on the common language runtime.
With regard to security, managed components are awarded varying degrees of trust,
depending on a number of factors that include their origin (such as the Internet, enterprise
network, or local computer). This means that a managed component might or might not be
able to perform file-access operations, registry-access operations, or other sensitive functions,
even if it is being used in the same active application.
The runtime enforces code access security. For example, users can trust that an executable
embedded in a Web page can play an animation on screen or sing a song, but cannot access
their personal data, file system, or network.
The security features of the runtime thus enable legitimate Internet-deployed software to be
exceptionally feature-rich.
The runtime also enforces code robustness through the common type system (CTS), which
ensures that managed code can consume other managed types and instances, while strictly
enforcing type fidelity and type safety.
In addition, the managed environment of the runtime eliminates many common software
issues. For example, the runtime automatically handles object layout and manages references
to objects, releasing them when they are no longer being used. This automatic memory
management resolves the two most common application errors, memory leaks and invalid
memory references.
The runtime also accelerates developer productivity. For example, programmers can write
applications in their development language of choice, yet take full advantage of the runtime,
the class library, and components written in other languages by other developers. Any
compiler vendor who chooses to target the runtime can do so.
Language compilers that target the .NET Framework make the features of the .NET
Framework available to existing code written in that language, greatly easing the migration
process for existing applications.
While the runtime is designed for the software of the future, it also supports software of today
and yesterday. Interoperability between managed and unmanaged code enables developers to
continue to use necessary COM components and DLLs.
The runtime is designed to enhance performance. Although the common language runtime
provides many standard runtime services, managed code is never interpreted. A feature called
just-in-time (JIT) compiling enables all managed code to run in the native machine language
of the system on which it is executing. Meanwhile, the memory manager removes the
possibilities of fragmented memory and increases memory locality-of-reference to further
increase performance.
This infrastructure enables you to use managed code to write your business logic, while still
enjoying the superior performance of the industry's best enterprise servers that support runtime
hosting.
The .NET Framework class library is a collection of reusable types that tightly integrate with
the common language runtime. The class library is object oriented, providing types from
which your own managed code can derive functionality. This not only makes the .NET
Framework types easy to use, but also reduces the time associated with learning new features
of the .NET Framework. In addition, third-party components can integrate seamlessly with
classes in the .NET Framework.
For example, the .NET Framework collection classes implement a set of interfaces that you
can use to develop your own collection classes. Your collection classes will blend seamlessly
with the classes in the .NET Framework.
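As a short sketch of this idea, a custom collection class in Visual Basic .NET can implement the framework's `IEnumerable` interface; the class and data here are hypothetical:

```vbnet
Imports System.Collections

' A hypothetical collection that implements IEnumerable so it blends
' with the .NET Framework collection classes and works with For Each.
Public Class MemberList
    Implements IEnumerable

    Private ReadOnly names As String() = {"Ravi", "Anil"}

    Public Function GetEnumerator() As IEnumerator _
            Implements IEnumerable.GetEnumerator
        ' Delegate to the array's own enumerator.
        Return names.GetEnumerator()
    End Function
End Class
```

Because it implements the standard interface, code such as `For Each n As String In New MemberList()` works exactly as it does for the built-in collections.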
Using the .NET Framework, you can develop the following types of applications and services:
Console applications.
Windows GUI applications (Windows Forms).
ASP.NET applications.
XML Web services.
Windows services.
For example, the Windows Forms classes are a comprehensive set of reusable types that vastly
simplify Windows GUI development. If you write an ASP.NET Web Form application, you
can use the Web Forms classes.
Another kind of client application is the traditional ActiveX control (now replaced by the
managed Windows Forms control) deployed over the Internet as a Web page. This application
is much like other client applications: it is executed natively, has access to local resources, and
includes graphical elements.
In the past, developers created such applications using C/C++ in conjunction with the
Microsoft Foundation Classes (MFC) or with a rapid application development (RAD)
environment such as Microsoft® Visual Basic®. The .NET Framework incorporates aspects of
these existing products into a single, consistent development environment that drastically
simplifies the development of client applications.
The Windows Forms classes contained in the .NET Framework are designed to be used for
GUI development. You can easily create windows, buttons,
menus, toolbars, and other screen elements with the flexibility necessary to accommodate
shifting business needs.
For example, the .NET Framework provides simple properties to adjust visual attributes
associated with forms. In some cases the underlying operating system does not support
changing these attributes directly, and in these cases the .NET Framework automatically
recreates the forms.
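A brief sketch of such property-based adjustment follows; the form title and chosen attributes are illustrative assumptions:

```vbnet
Imports System.Drawing
Imports System.Windows.Forms

' Simple properties adjust the visual attributes of a form.
Public Module FormDemo
    Public Function MakeForm() As Form
        Dim f As New Form()
        f.Text = "Gym Management"
        f.BackColor = Color.LightSteelBlue          ' visual attribute
        f.FormBorderStyle = FormBorderStyle.FixedDialog
        f.Opacity = 0.95    ' where the OS cannot change this directly,
                            ' the framework recreates the form for you
        Return f
    End Function
End Module
```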
This is one of many ways in which the .NET Framework integrates the developer interface,
making coding simpler and more consistent.
Unlike ActiveX controls, Windows Forms controls have semi-trusted access to a user's
computer. This means that binary or natively executing code can access some of the resources
on the user's system (such as GUI elements and limited file access) without being able to
access or compromise other resources. Because of code access security, many applications that
once needed to be installed on a user's system can now be deployed through the Web. Your
applications can implement the features of a local application while being deployed like a Web
page.
Server-side applications in the managed world are implemented through runtime hosts.
Unmanaged applications host the common language runtime, which allows your custom
managed code to control the behavior of the server.
This model provides you with all the features of the common language runtime and class
library while gaining the performance and scalability of the host server.
The following illustration shows a basic network schema with managed code running in
different server environments. Servers such as IIS and SQL Server can perform standard
operations while your application logic executes through the managed code.
XML Web services, an important evolution in Web-based technology, are distributed, server-
side application components similar to common Web sites. However, unlike Web-based
applications, XML Web services components have no UI and are not targeted for browsers
such as Internet Explorer and Netscape Navigator. Instead, XML Web services consist of
reusable software components designed to be consumed by other applications, such as
traditional client applications, Web-based applications, or even other XML Web services. As a
result, XML Web services technology is rapidly moving application development and
deployment into the highly distributed environment of the Internet.
If you have used earlier versions of ASP technology, you will immediately notice the
improvements that ASP.NET and Web Forms offer. For example, you can develop Web
Forms pages in any language that supports the .NET Framework.
In addition, your code no longer needs to share the same file with your HTTP text (although it
can continue to do so if you prefer).
Web Forms pages execute in native machine language because, like any other managed
application, they take full advantage of the runtime. In contrast, unmanaged ASP pages are
always scripted and interpreted. ASP.NET pages are faster, more functional, and easier to
develop than unmanaged ASP pages because they interact with the runtime like any managed
application. The .NET Framework also provides a collection of classes and tools to aid in the
development and consumption of XML Web services applications. XML Web services are
built on standards such as SOAP (a remote procedure-call protocol), XML (an extensible data
format), and WSDL (the Web Services Description Language). The .NET Framework is built
on these standards to promote interoperability with non-Microsoft solutions.
For example, the Web Services Description Language tool included with the .NET Framework
SDK can query an XML Web service published on the Web, parse its WSDL description, and
produce C# or Visual Basic Source code that your application can use to become a client of the
XML Web service.
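As a sketch, the tool described above can be invoked from the command line roughly as follows; the service URL and output file name are hypothetical:

```
wsdl /language:VB /out:GymServiceProxy.vb http://example.com/GymService.asmx?WSDL
```

The generated proxy class can then be compiled into the client application, which calls the Web service through ordinary method calls.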
The source code can create classes derived from classes in the class library that handle all the
underlying communication using SOAP and XML parsing. Although you can use the class
library to consume XML Web services directly, the Web Services Description Language tool
and the other tools contained in the SDK facilitate your development efforts with the .NET
Framework.
If you develop and publish your own XML Web service, the .NET Framework provides a set
of classes that conform to all the underlying communication standards, such as SOAP, WSDL,
and XML. Using those classes enables you to focus on the logic of your service, without
concerning yourself with the communications infrastructure required by distributed software
development.
Finally, like Web Forms pages in the managed environment, your XML Web service will run
with the speed of native machine language using the scalable communication of IIS.
ADO.NET Overview
ADO.NET is an evolution of the ADO data access model that directly addresses user
requirements for developing scalable applications. It was designed specifically for the
web with scalability, statelessness, and XML in mind.
ADO.NET uses some ADO objects, such as the Connection and Command objects,
and also introduces new objects. Key new ADO.NET objects include the DataSet,
DataReader, and DataAdapter.
The important distinction between this evolved stage of ADO.NET and previous data
architectures is that there exists an object -- the DataSet -- that is separate and
distinct from any data stores. Because of that, the DataSet functions as a standalone
entity. You can think of the DataSet as an always disconnected recordset that knows
nothing about the source or destination of the data it contains. Inside a DataSet,
much like in a database, there are tables, columns, relationships, constraints, views,
and so forth.
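These database-like structures can be built entirely in memory; the following sketch uses hypothetical table and column names for this project:

```vbnet
Imports System.Data

' Building tables, columns, a constraint and a relationship in a DataSet.
Module DataSetDemo
    Sub Main()
        Dim ds As New DataSet("Gym")

        Dim members As DataTable = ds.Tables.Add("Members")
        members.Columns.Add("MemberID", GetType(Integer))
        members.Columns.Add("Name", GetType(String))
        ' A constraint, just like in a database.
        members.PrimaryKey = New DataColumn() {members.Columns("MemberID")}

        Dim fees As DataTable = ds.Tables.Add("Fees")
        fees.Columns.Add("MemberID", GetType(Integer))
        fees.Columns.Add("Amount", GetType(Decimal))

        ' A parent-child relationship between the two tables.
        ds.Relations.Add("MemberFees", _
                         members.Columns("MemberID"), _
                         fees.Columns("MemberID"))

        members.Rows.Add(1, "Ravi")
        fees.Rows.Add(1, 500D)
    End Sub
End Module
```

Note that nothing here touches a database: the DataSet remains a disconnected, standalone cache.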
A DataAdapter is the object that connects to the database to fill the DataSet. Then, it
connects back to the database to update the data there, based on operations
performed while the DataSet held the data. In the past, data processing has been
primarily connection-based. Now, in an effort to make multi-tiered apps more efficient,
data processing is turning to a message-based approach that revolves around chunks
of information. At the center of this approach is the DataAdapter, which provides a
bridge to retrieve and save data between a DataSet and its source data store. It
accomplishes this by means of requests to the appropriate SQL commands made
against the data store.
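The Fill/Update round trip can be sketched as follows for the MS Access database used in this project; the connection string, file name and table name are assumptions for illustration:

```vbnet
Imports System.Data
Imports System.Data.OleDb

' The DataAdapter as a bridge between an Access database and a DataSet.
Module AdapterDemo
    Sub Main()
        Dim cn As New OleDbConnection( _
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=gym.mdb")
        Dim da As New OleDbDataAdapter("SELECT * FROM [Bio-data]", cn)
        Dim ds As New DataSet()

        da.Fill(ds, "Bio-data")       ' runs the SELECT, fills the cache

        ' Work with the data while disconnected.
        ds.Tables("Bio-data").Rows(0)("Phone") = "9999999999"

        ' Let a CommandBuilder derive INSERT/UPDATE/DELETE commands.
        Dim cb As New OleDbCommandBuilder(da)
        da.Update(ds, "Bio-data")     ' pushes the changes back
    End Sub
End Module
```

The adapter opens and closes the connection itself around Fill and Update, which is what makes the message-based, disconnected style possible.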
The XML-based DataSet object provides a consistent programming model that works
with all models of data storage: flat, relational, and hierarchical. It does this by having
no 'knowledge' of the source of its data, and by representing the data that it holds as
collections and data types. No matter what the source of the data within the DataSet
is, it is manipulated through the same set of standard APIs exposed through the
DataSet and its subordinate objects.
While the DataSet has no knowledge of the source of its data, the managed provider
has detailed and specific information. The role of the managed provider is to connect,
fill, and persist the DataSet to and from data stores. The OLE DB and
SQL Server .NET Data Providers (System.Data.OleDb and System.Data.SqlClient)
that are part of the .NET Framework provide four basic objects: the Command,
Connection, DataReader and DataAdapter. For an MS Access database such as the
one in this project, the OLE DB provider is the appropriate choice. In the
remaining sections of this document, we'll walk through each part of the
DataSet and the OLE DB/SQL Server .NET Data Providers, explaining what they
are and how to program against them.
The following sections will introduce you to some objects that have evolved, and
some that are new. These objects are:
When dealing with connections to a database, there are two different options: the SQL Server
.NET Data Provider (System.Data.SqlClient) and the OLE DB .NET Data Provider
(System.Data.OleDb). The SqlClient classes are written to talk directly to Microsoft SQL
Server; the OLE DB .NET Data Provider is used to talk to any OLE DB provider (as it uses
OLE DB underneath), and is therefore the one used with a Microsoft Access database such as
the one in this project.
Connections
Connections are used to 'talk to' databases, and are represented by provider-specific
classes such as SQLConnection. Commands travel over connections and resultsets
are returned in the form of streams which can be read by a DataReader object, or
pushed into a DataSet object.
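The connection-command-reader flow just described can be sketched as follows; the database file, table and column names are hypothetical:

```vbnet
Imports System
Imports System.Data.OleDb

' Opening a connection and reading the resultset as a stream.
Module ConnectionDemo
    Sub Main()
        Using cn As New OleDbConnection( _
                "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=gym.mdb")
            cn.Open()
            Dim cmd As New OleDbCommand("SELECT Name FROM Members", cn)
            Using rdr As OleDbDataReader = cmd.ExecuteReader()
                While rdr.Read()       ' forward-only stream of rows
                    Console.WriteLine(rdr.GetString(0))
                End While
            End Using
        End Using                      ' connection closed here
    End Sub
End Module
```

The `Using` blocks guarantee the reader and the connection are closed even if an exception occurs.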
Commands
Commands contain the information that is submitted to a database, typically a SQL statement
or a stored procedure name, and are represented by provider-specific classes such as
SqlCommand and OleDbCommand.
Data Readers
The DataReader object provides a read-only, forward-only stream of rows returned from a
query; it is the fastest way to simply read through query results.
DataSets
The DataSet object is similar to the ADO Recordset object, but more powerful, and
with one other important distinction: the DataSet is always disconnected. The
DataSet object represents a cache of data, with database-like structures such as
tables, columns, relationships, and constraints. However, though a DataSet can and
does behave much like a database, it is important to remember that DataSet objects
do not interact directly with databases, or other source data. This allows the
developer to work with a programming model that is always consistent, regardless of
where the source data resides. Data coming from a database, an XML file, from code,
or user input can all be placed into DataSet objects. Then, as changes are made to
the DataSet they can be tracked and verified before updating the source data. The
GetChanges method of the DataSet object actually creates a second DataSet that
contains only the changes to the data. This DataSet is then used by a DataAdapter
(or other objects) to update the original data source.
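The GetChanges pattern can be sketched as a small helper; the table name and adapter are hypothetical and would come from earlier Fill code:

```vbnet
Imports System.Data
Imports System.Data.OleDb

' Extracting only the modified rows before updating the source.
Module ChangesDemo
    Sub SaveChanges(ds As DataSet, da As OleDbDataAdapter)
        ' GetChanges returns a second DataSet holding only changed rows,
        ' or Nothing when there is nothing to save.
        Dim changes As DataSet = ds.GetChanges()
        If changes IsNot Nothing Then
            da.Update(changes, "Bio-data")  ' push changes to the source
            ds.AcceptChanges()              ' mark the cache as clean
        End If
    End Sub
End Module
```

Updating from the smaller "changes" DataSet keeps the round trip to the database as light as possible.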
The DataSet has many XML characteristics, including the ability to produce and
consume XML data and XML schemas. XML schemas can be used to describe
schemas interchanged via WebServices. In fact, a DataSet with a schema can
actually be compiled for type safety and statement completion.
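The XML abilities mentioned above amount to a pair of method calls; the file names and table here are illustrative:

```vbnet
Imports System.Data

' A DataSet can produce and consume XML data and XML schemas.
Module XmlDemo
    Sub Main()
        Dim ds As New DataSet("Gym")
        Dim t As DataTable = ds.Tables.Add("Members")
        t.Columns.Add("Name", GetType(String))
        t.Rows.Add("Ravi")

        ds.WriteXml("members.xml")         ' the data, as XML
        ds.WriteXmlSchema("members.xsd")   ' the structure, as an XSD

        Dim ds2 As New DataSet()
        ds2.ReadXml("members.xml")         ' consume the XML again
    End Sub
End Module
```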
DataAdapters (OLEDB/SQL)
The DataAdapter object works as a bridge between the DataSet and the source
data. Using the provider-specific SqlDataAdapter (along with its associated
SqlCommand and SqlConnection) can increase overall performance when working
with a Microsoft SQL Server database. For other OLE DB-supported databases,
such as Microsoft Access, you would use the OleDbDataAdapter object and its
associated OleDbCommand and OleDbConnection objects.
The DataAdapter object uses commands to update the data source after changes
have been made to the DataSet. Using the Fill method of the DataAdapter calls the
SELECT command; using the Update method calls the INSERT, UPDATE or
DELETE command for each changed row. You can explicitly set these commands in
order to control the statements used at runtime to resolve changes, including the use
of stored procedures. For ad-hoc scenarios, a CommandBuilder object can generate
these at run-time based upon a select statement. However, this run-time generation
requires an extra round-trip to the server in order to gather required metadata, so
explicitly providing the INSERT, UPDATE, and DELETE commands at design time
will result in better run-time performance.
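Supplying a command explicitly at design time, as recommended above, looks roughly like this; the table, columns and parameter sizes are assumptions for illustration:

```vbnet
Imports System.Data.OleDb

' An explicit UPDATE command avoids the extra metadata round-trip
' that a CommandBuilder would need at run time.
Module ExplicitCommandDemo
    Function BuildAdapter(cn As OleDbConnection) As OleDbDataAdapter
        Dim da As New OleDbDataAdapter( _
            "SELECT MemberID, Phone FROM Members", cn)

        ' OLE DB uses positional ? placeholders for parameters.
        Dim upd As New OleDbCommand( _
            "UPDATE Members SET Phone = ? WHERE MemberID = ?", cn)
        upd.Parameters.Add("@Phone", OleDbType.VarChar, 15, "Phone")
        upd.Parameters.Add("@MemberID", OleDbType.Integer, 0, "MemberID")

        da.UpdateCommand = upd   ' used by da.Update for each changed row
        Return da
    End Function
End Module
```

The last argument of `Parameters.Add` names the DataSet column each parameter is bound to, so Update can fill in the values row by row.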
System analysis and design focus on systems, processes, and technology. Having
a firm grasp of the makeup of the system in question is a prerequisite for selecting the
procedure of introducing the computer for implementation. Thus a background in system
concepts and a familiarity with the ways organizations function are helpful.
System analysis and design for information systems were founded in general
system theory, which emphasizes a close look at all parts of a system. Too often, analyses
focus on only one component and overlook other, equally important components. General
system theory is concerned with “developing a systematic theoretical framework upon which
to make decisions”.
The term system is derived from the Greek word systema, which means an
organized relationship among functioning units or components. A system exists because it is
designed to achieve one or more objectives. We come into daily contact with the
transportation system, the telephone system, the accounting system, etc. There are more than a
hundred definitions of the word system, but most seem to have a common thread that suggests
that a system is an orderly grouping of interdependent components linked together
according to a plan to achieve a specific objective.
The study of system concepts, then, has three basic implications:
1. A system must be designed to achieve a predetermined objective.
2. Interrelationships and interdependence must exist among the components.
3. The objectives of the organization as a whole have a higher priority than
the objectives of its subsystems. For example, computerizing personnel applications must
conform to the organization's policy on privacy, confidentiality and security, such as making
data available only to the concerned division on request.
Our System –
Our definition of a system suggests some characteristics that are present in every
system: organization, interaction, interdependence, integration and a central objective.
Our system is Automation of Payroll. This system manages five tasks –
1. Staff Bio-data: This section intends to fulfil all the entries related to a new
member of staff. These entries include the personal details of employees (like address, phone
number etc.). This information is managed in the table “Bio-data” of our project.
3. Office Works Entry: The records related to any staff member are
managed using the table “Office works” in our project.
4. Manager: The names of the managers of the department are entered into a table
in our project.
System analysis refers to the process of examining a business situation with the
intent of improving it through better procedures and methods. System analysis is the process of
gathering and interpreting facts, diagnosing problems and using the information to recommend
improvements to the system. System analysis specifies what the system should do. From the
above definition I concluded that it is a management technique which helps in designing a new
system, improving an existing system or solving a system problem. Requirement analysis is a
software task that bridges the gap between system-level requirement engineering and software
design. There are four basic elements:
a) Input: Once I know goals I can easily determine what the input should be. If the
information is important to the system, I should make all possible efforts to make it available.
Sometimes it may be too costly to get the information from top management. So I have to
prepare cost benefit analysis. Following are the main elements of input.
1. Accuracy
2. Timeliness
3. Proper format
4. Economy
In my project's context, the information in the fields associated with Network Marketing forms
the inputs. To save time and cost I prepared proper forms. These forms will prove
helpful for end users.
b) Functions: Most of the functions are compulsory for the system; they may be
related to user information or be system based. Functions like password generation and
automatic user ID generation are system based.
c) Processes: Process involves the program and the procedure in which inputs are
converted to outputs. The processing contains a set of logical steps. This series is compulsory
for programming.
IDENTIFICATION OF NEED
The first step of system analysis process involves the identification of need. The
system analyst (system engineer) meets with the customer & the end-users (if different from
the customer). Identification of need is the starting point in the evolution of a computer-based
system.
The need for the new system is those features or details that must be incorporated
to produce the improvements in the existing system. In other words, they are the activities or
improvements that the new system must provide. They are determined by comparing current
performance with the objectives for acceptable system performance.
The analyst assists the customer in defining the goals of the system:
What information will be produced, i.e. identifying outputs?
What information is to be consumed, i.e. identifying inputs? What functions and
transformations are required?
PRELIMINARY INVESTIGATION
The first step in the system development life cycle is the preliminary investigation
to determine the feasibility of the system. In this kind of investigation I evaluate the merits of
the project request and make an informed judgment about the feasibility of the proposed
project.
This investigation accomplishes the following objectives:
Clarify and understand the project request.
Determine the size of the project.
Assess cost and benefits of alternative approaches.
Determine the technical and operational feasibility of alternative approaches.
Report the findings to management, with recommendations outlining the acceptance or
rejection of the proposal.
On-site observations:
It is the process of recognizing and noting people, objects and occurrences to obtain
information. The major objective of on-site observation is to get as close as possible to the real
system. The information about the present workflow, objects and people involved was gathered
in this way. The physical layout of the current system and the location and movement of
people/staff were analyzed.
For collecting data, I did on-site observation. In this, I observed the activities of the system
directly. Our purpose of on-site observation was to get as close as possible to the real system
being studied. During on-site observation, I saw the office environment, the workload of the
system and the users, the methods of work and the facilities provided by the organization to the
users, which can help the developer to understand the processes and procedures in the
development of the new system.
Conducting interviews:
Written documents and the on-site observation technique only tell how the system
should operate, but they may not include enough details to allow a decision to be made about
the merits of a systems proposal, nor do they present user views about current operations.
So, I used interviews to learn these details. Interviews allowed me to learn more about the
nature of the project request and the reason for submitting it. Interviews provide details that
further explain the project and show whether assistance is merited economically, operationally
and technically.
Once a preliminary area of application has been identified, it may then be subjected to a more
rigorous examination in a feasibility study.
FEASIBILITY STUDY
A feasibility study is carried out to select the best system that meets performance
requirements. Feasibility is the determination of whether or not a project is worth doing. The
process followed in making this determination is called a feasibility study. This type of study
determines if a project can and should be taken.
Since the feasibility study may lead to the commitment of large resources, it
becomes necessary that it should be conducted competently and that no fundamental errors of
judgment are made.
TECHNICAL FEASIBILITY
This is concerned with specifying equipment and software that will successfully
satisfy the user requirements. The technical needs of the system may include:
The facility to produce outputs in a given time.
Response time under certain conditions.
Ability to process a certain volume of transaction at a particular speed.
Facility to communicate data to distant location.
In examining technical feasibility, configuration of the system is given more
importance than the actual make of hardware. The configuration should give the complete
picture about the system’s requirements.
The proposed system is technically feasible because of the following reasons:
The organization already has a server-client setup, so this system can run in the organization.
The organization does not require any new package in their computer system as it
already has all the required software.
The organization is already working with a system made in dBase III and has knowledge
about the processes, databases and expected outputs.
The proposed system is to be implemented using COM technology, the server
components are to be installed on server and client components on client. All the Clients will
get response according to the application loaded on their terminals.
OPERATIONAL FEASIBILITY
It is mainly related to human, organizational and political aspects. The points to be
considered are:
What changes will be brought about with the system?
What organizational structures are disturbed?
What new skills will be required? Do the existing staff members have these skills? If
not, can they be trained in due course of time?
Such considerations are likely to critically affect the nature and scope of the eventual
recommendations. This feasibility study is carried out by a small group of people who are
familiar with information system techniques, who understand the parts of the business that are
relevant to the project, and who are skilled in the system analysis and design process.
The proposed system is operationally feasible for the following reasons:
The organization is already working with a computerized system and has knowledge
about its functioning and database.
The interactivity of the existing system is very poor, as each command has to be entered
manually, while the proposed system is much faster.
The proposed system is easier to use and more user-friendly, as it generates proper
messages at run time.
The input required from the user is minimal, as fields like USER ID and PASSWORD are
generated by the system itself.
ECONOMIC FEASIBILITY
Economic analysis is the most frequently used technique for evaluating the
effectiveness of a proposed system. In this I determine the benefits and savings that are
expected from a proposed system and compare them with costs. It benefits outweigh costs; a
decision is taken to design and implement the system. This is an ongoing effort that improves
in accuracy at each phase of the system life cycle.
The proposed system is economically feasible for the following reasons:
The cost of system development is nil, as a trainee staff member is developing it.
The organization already has a client-server setup, so this system can run in the
organization; hence the hardware investment is nil.
The organization does not require any new package in its computer system, as it
already has all the required software.
The proposed system is economical, as it will reduce the time invested in running the
daily transactions.
The employees are already working with the old system, so the cost of learning to work
with the new system is very low.
The system itself will be highly interactive and self-explanatory, and also contains
user manuals to assist the operator.
There is no extra cost for the salary of operational staff.
DESIGN:
It is a multi-step process in which we focus on four distinct attributes of the project:
data structure, software architecture, interface representation and algorithmic detail. First of
all, according to the needs of the project, I designed the tables and the various fields of the
data structure. After designing the tables, I decided upon the type of software architecture best
suited for this project, the language to be used and the tier system to be applied. The whole
design was then translated into a representation of the software that was assessed for
quality before embarking on the process of coding. The design is documented and is part of
the software configuration.
CODING:
After the designing process was completed, the implementation of the design was done
through the actual coding. Coding is the way in which the design is translated into a machine-
readable form. Since I gave a large chunk of the development time to the designing of the
project, writing the actual code became an easy job to handle.
TESTING:
After code generation, testing begins. The testing process focuses on the logical
internals of the software, ensuring that all statements have been tested, and on the functional
externals, that is, conducting tests to uncover errors and to ensure that defined inputs will
produce actual results that agree with the required results.
MAINTENANCE:
Software will undoubtedly undergo change after it is delivered to the
organization. Changes occur because errors have been encountered, because the software must
accommodate changes in the external environment, or because the organization requires
functional or performance enhancements. Software maintenance applies each of the
preceding phases to an existing program rather than a new one.
[Waterfall model diagram: System Requirements, Software Requirements, Preliminary Design,
Detailed Functional Specifications, Coding, Testing and Maintenance, with a Review & Validation
step following each phase.]
NORMALIZATION
It is the process of converting a relation to a standard form. The process is used to handle the
problems that can arise due to data redundancy, i.e. the repetition of data in the database, to
maintain data integrity, and to handle problems that can arise due to insertion, updating and
deletion anomalies.
Transitive Dependency: If a non-key attribute depends on another non-key attribute, which in
turn depends on the primary key, then the first attribute is said to be transitively dependent.
The above normalization principles were applied to decompose the data into multiple tables,
thereby keeping the data in a consistent state.
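As an illustration of removing a transitive dependency (the table and column names below are hypothetical examples, not taken from the project's actual data dictionary), suppose a member table also stored the phone number of the member's trainer. Trainer_Phone then depends on Trainer_ID, which in turn depends on the key Member_ID; third normal form decomposes the relation:

```sql
-- Before: Trainer_Phone depends on Trainer_ID, not directly on the key Member_ID
-- Member(Member_ID, Name, Trainer_ID, Trainer_Phone)

-- After: each non-key attribute depends only on its own table's key
CREATE TABLE Trainer (
    Trainer_ID    TEXT PRIMARY KEY,
    Trainer_Phone TEXT
);

CREATE TABLE Member (
    Member_ID  TEXT PRIMARY KEY,
    Name       TEXT,
    Trainer_ID TEXT REFERENCES Trainer (Trainer_ID)
);
```

With this split, a trainer's phone number is stored only once, so updating it cannot leave the database in an inconsistent state.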
Data Dictionary:
After carefully understanding the requirements of the client, the entire data storage
requirement is divided into tables. These tables are normalized to avoid any anomalies
during the course of data entry.
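As a sketch of what one such table definition might look like, the account table read and written by the source code later in this report (Acc_Info, with nine fields: an ID, seven further text fields and a date) could be declared as follows. Only Acc_ID appears by name in the code; every other column name here is a hypothetical placeholder, since the report does not list the actual field names:

```sql
-- Hypothetical layout for Acc_Info; column names other than Acc_ID are assumptions
CREATE TABLE Acc_Info (
    Acc_ID   TEXT PRIMARY KEY,  -- looked up by the search form
    Field_2  TEXT,
    Field_3  TEXT,
    Field_4  TEXT,
    Field_5  TEXT,
    Field_6  TEXT,
    Field_7  TEXT,
    Field_8  TEXT,
    Acc_Date DATETIME           -- bound to the form's DateTimePicker
);
```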
ENTITY
RELATIONSHIP
DIAGRAM
Entity Relationship Diagram
Data flow
Diagram
DATA FLOW DIAGRAM:
A data flow diagram is a graphical tool used to describe and analyze the movement of data
through a system. These are the central tool and the basis from which the other components are
developed. The transformation of data from input to output, through processes, may be
described logically and independently of the physical components associated with the system.
These are known as logical data flow diagrams. The physical data flow diagrams show the
actual implementation and movement of data between people, departments and workstations. A
full description of a system actually consists of a set of data flow diagrams. The data flow
diagrams are developed using two familiar notations, Yourdon and Gane & Sarson. Each
component in a DFD is labeled with a descriptive name. A process is further identified with a
number that will be used for identification purposes. The development of DFDs is done in
several levels. Each process in a lower-level diagram can be broken down into a more detailed
DFD at the next level. The top-level diagram is often called the context diagram. It consists of a
single process, which plays a vital role in studying the current system. The process in the
context-level diagram is exploded into other processes at the first-level DFD.
The idea behind the explosion of a process into more processes is that the understanding at
one level of detail is exploded into greater detail at the next level. This is done until no further
explosion is necessary and an adequate amount of detail is described for the analyst to understand
the process.
Larry Constantine first developed the DFD as a way of expressing system requirements
in a graphical form; this led to modular design.
A DFD, also known as a "bubble chart", has the purpose of clarifying system
requirements and identifying the major transformations that will become programs in system
design. So it is the starting point of the design, down to the lowest level of detail. A DFD
consists of a series of bubbles joined by data flows in the system.
DFD SYMBOLS:
Data flow
Data Store
CONSTRUCTING A DFD:
SOURCE OR SINK
The origin and/or destination of data.
1) Data cannot move directly from a source to a sink; it must be moved by a process.
2) A source and/or sink has a noun-phrase label.
DATA FLOW
1) A data flow has only one direction of flow between symbols. It may flow in both
directions between a process and a data store, to show a read before an update; the latter is
usually indicated, however, by two separate arrows, since these happen at different times.
2) A join in a DFD means that exactly the same data comes from any of two or more
different processes, data stores or sinks to a common location.
3) A data flow cannot go directly back to the same process it leaves. There must be at least
one other process that handles the data flow and produces some other data flow that returns
the original data to the beginning process.
4) A data flow to a data store means update (delete or change).
5) A data flow from a data store means retrieve or use.
6) A data flow has a noun-phrase label; more than one data flow noun phrase can appear on
a single arrow, as long as all of the flows on the same arrow move together as one package.
[Diagram labels: database stored in MS Access; reports generated.]
DATA FLOW DIAGRAM (DFD)
A DFD is a model which gives insight into the information domain and the
functional domain at the same time. A DFD is refined into different levels: the more
refined the DFD is, the more details of the system are incorporated. In the process of
creating a DFD, we decompose the system into different functional subsystems.
The DFD refinement results in a corresponding refinement of the data.
Following is the DFD of the proposed system. We have refined the system up to
two levels. Each break-up has been numbered as per the rules of DFD. We have tried
to incorporate all the details of the system, but there is some chance of further
improvement because of the study that is still going on for the project development.
This level shows the overall context of the system and its operating environment and
shows the whole system as just one process.
[Context diagram: the GYM CENTER'S RECORD MANAGEMENT SYSTEM is shown as a single
process. It exchanges member details, member lists and receipts with members, and employee
details and employee schedules with employees, while performing database operations on the
MEMBERS, EMPLOYEES and INVENTORY (orders and products) records.]
About
Database
INTRODUCTION TO RELATIONAL
DATABASE
INTRODUCTION:
Any industry needs proper planning of manpower and materials for optimum production. The
pertinent need for the processing of data related to production planning is accepted, and the
most basic tool for modern management is a computerized database system that is operational
for the storage, retrieval and processing of the data pertaining to dockets, materials and stores.
The present work relates to the design and operationalization of a DBMS integrating the
CAD and CAM applications developed for transformers; however, the same can be customized
and used for other similar kinds of applications. Prior to this, data processing was carried out
on computers using programming environments such as COBOL. However, since the
mid-seventies the Database Management System (DBMS) has emerged as the outstanding
package for data storage, administration, retrieval and effective management. It also supports
a high-level query language for the most effective query and report generation. In fact, the
DBMS supersedes the programming approach to data processing in a most convincing way.
The present work elaborates this point still further.
Definition:
According to James Martin, in "Computer Database Organization", a database is: "A
collection of interrelated data stored together without harmful or unnecessary
redundancy to serve multiple applications; the data are stored so that they are independent
of the programs which use the data; a common and controlled approach is used in adding new
data and in modifying and retrieving existing data within the database; the data is structured
so as to provide a foundation for future application development."
There is a controlled approach to adding new data, modifying and retrieving existing data,
and defining new data. The database administrator (DBA) usually performs this function.
Definition of D.B.M.S.:
A database management system (DBMS) is software that provides for the simultaneous use
of a database by multiple users, and provides tools for accessing and manipulating the data in
the database.
Different database management systems organize the data in the database in different
ways; the organizational structure used by Oracle follows the relational DBMS model.
Therefore, Oracle is a "relational database management system".
The following terms are commonly used to describe the data, procedures and
data structures in a relational database.
COLUMN
End Sub
End Sub
Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
    Try
        con.Open()
        ' Parameterized query avoids SQL injection from the typed-in ID
        cmd = New OleDb.OleDbCommand("select * from Acc_Info where [Acc_ID] = ?", con)
        cmd.Parameters.AddWithValue("@id", InputBox("Enter The Id"))
        DR = cmd.ExecuteReader()
        If DR.Read() Then
            ' Copy the nine stored fields back into the form controls
            TextBox1.Text = DR(0).ToString()
            TextBox2.Text = DR(1).ToString()
            TextBox3.Text = DR(2).ToString()
            TextBox4.Text = DR(3).ToString()
            TextBox5.Text = DR(4).ToString()
            TextBox6.Text = DR(5).ToString()
            TextBox7.Text = DR(6).ToString()
            TextBox8.Text = DR(7).ToString()
            DateTimePicker1.Text = DR(8).ToString()
        Else
            MessageBox.Show("Record not found......")
        End If
        DR.Close()
        cmd.Dispose()
        con.Close()
    Catch ex As Exception
        MessageBox.Show(ex.Message, "ERROR", MessageBoxButtons.OK, MessageBoxIcon.Error)
    End Try
End Sub
Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click
    ' Ask for confirmation before the record is written
    Dim a As DialogResult = MsgBox("Are you sure you want to save this record?", MsgBoxStyle.YesNo, "Verify...")
    If a <> DialogResult.Yes Then Exit Sub
    Try
        ' Parameterized insert: one placeholder per Acc_Info column
        Dim cmd As New OleDbCommand("insert into Acc_Info values (?,?,?,?,?,?,?,?,?)", con)
        For Each tb As TextBox In New TextBox() {TextBox1, TextBox2, TextBox3, TextBox4, TextBox5, TextBox6, TextBox7, TextBox8}
            cmd.Parameters.AddWithValue("@p", tb.Text)
        Next
        cmd.Parameters.AddWithValue("@p", DateTimePicker1.Text)
        con.Open()
        cmd.ExecuteNonQuery()
        MsgBox("Data Saved.........")
    Catch ex As Exception
        MsgBox(ex.Message)
    Finally
        new1()
        con.Close()
    End Try
End Sub
Once the software is developed and delivered, it enters the maintenance phase. All
systems need maintenance. Software needs to be maintained because there are often
some residual errors or bugs remaining in the system that must be removed as they
are discovered. Many of these surface only after the system has been in operation,
sometimes for a long time. These errors, once discovered, need to be removed, leading
to maintenance work: understanding the existing software (code and related
documents), understanding the effects of change, making the changes to both the
code and the documents, testing the new parts and retesting the old parts.
Conclusion
CONCLUSION
the gym, gym equipment details etc. This software package allows
storing the details of all the data related to the gymnasium. The system