IMPORTANT QUESTION
Software requirement analysis : to understand the nature of the program to be built, the
software engineer must understand the information domain for the software, as well as
the required function, behavior, performance and interface.
In a ticket reservation system, the requirements of the whole system are reviewed. It is
determined what the requirements of the system are, who the users of the system are,
what the various functions in the system are, and so on; then the ER diagram, data flow
diagram, etc. are prepared.
Design : Software design is actually a multistep process that focuses on four distinct
attributes of a program: data structure, software architecture, interface representations
and procedural detail.
Code generation : Code generation translates the design into a machine-readable form. If
the design is performed in a detailed manner, code generation can be accomplished
mechanistically.
Testing : Once the code has been generated, program testing begins. The testing process
focuses on the logical internals of the software, ensuring that all statements have been
tested, and on the functional externals.
The linear sequential model is appropriate for a ticket reservation system, as it follows
all the steps necessary for the development of such a system. In this system, project
planning can be done in advance, and a ticket reservation system requires comparatively
few changes at a later stage. Moreover, in a ticket reservation system most of the
requirements can be stated in the early stages. The model provides a template into which
methods for analysis, design, coding, testing and support can be placed. There is little
need to make changes after the system is developed. Therefore the waterfall model is
appropriate for a ticket reservation system.
Ans 1 (b)
A system is a collection of elements assembled to fulfil some defined purpose. The
elements may be hardware or software components, organizational policies and
procedures, and operational processes.
Systems have properties which are emergent, i.e. they only come to light when the parts
are put together; systems have structure and mechanisms for communication and control.
Ans 1 (d)
Risk management is the process of measuring or assessing risk and developing strategies
to manage it. Strategies include transferring the risk to another party, avoiding the risk,
reducing the negative effect of the risk, and accepting some or all of the consequences of
a particular risk. Traditional risk management focuses on risks stemming from physical
or legal causes (e.g. natural disasters or fires, accidents, death, and lawsuits). Financial
risk management, on the other hand, focuses on risks that can be managed using traded
financial instruments.
In ideal risk management a prioritization process is followed whereby the risks with the
greatest loss and greatest probability of occurring are handled first, and risks with lower
probability of occurrence and lower loss are handled later. In practice the process can be
very difficult, and balancing between risks with a high probability of occurrence but
lower loss and risks with high loss but lower probability of occurrence is easy to
mishandle. Risk management also faces difficulties in allocating resources; this is the
idea of opportunity cost. Resources spent on risk management could have been spent on
more profitable activities. Again, ideal risk management minimizes spending while
maximizing the reduction of the negative effects of risks.
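The prioritization described above, ranking risks by expected loss (probability times loss), can be sketched in a few lines of Python. The risk names and figures below are illustrative assumptions, not from the text:

```python
# Minimal sketch of risk prioritization: handle the highest expected
# loss (probability x loss) first. All entries are made-up examples.

def prioritize(risks):
    """Sort risks so the highest expected loss is handled first."""
    return sorted(risks, key=lambda r: r["probability"] * r["loss"], reverse=True)

risks = [
    {"name": "server outage",  "probability": 0.05, "loss": 100_000},
    {"name": "minor data bug", "probability": 0.60, "loss": 1_000},
    {"name": "vendor lawsuit", "probability": 0.01, "loss": 400_000},
]

for r in prioritize(risks):
    # Print each risk alongside its expected loss.
    print(r["name"], r["probability"] * r["loss"])
```

Note that a frequent small loss can outrank a rare catastrophic one (or vice versa) once both are reduced to a single expected-loss number, which is exactly the balancing difficulty the text mentions.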
• Source analysis : Risk sources may be internal or external to the system that is the
target of risk management. Examples of risk sources are the stakeholders of a
project, the employees of a company, or the weather over an airport.
• Problem analysis : Risks are related to identified threats, for example the threat of
losing money, the threat of abuse of private information, or the threat of accidents
and casualties. Threats may exist with respect to various entities, most importantly
shareholders, customers and legislative bodies such as the government.
Decide on the combination of methods to be used for each risk. Each risk management
decision should be recorded and approved by the appropriate level of management, for
example a risk-control implementation plan and the persons responsible for those
actions. The risk management concept is old but is still not measured very effectively.
Implementation
Follow all of the planned methods for mitigating the effect of the risks. Purchase
insurance policies for the risks that have been decided to be transferred to an insurer,
avoid all risks that can be avoided without sacrificing the entity's goals, reduce others,
and retain the rest.
Risk analysis results and management plans should be updated periodically. There are
two primary reasons for this :
1. to evaluate whether the previously selected security controls are still
applicable and effective, and
2. to evaluate possible risk level changes in the business
environment. Information risks, for example, change rapidly with
the business environment.
The cost of performing software inspections includes the individual preparation effort of
each participant before the session and the conduct effort of the participants in the
inspection session. Typically, 4-5 people participate and each expends 1-2 hours of
preparation and 1-2 hours of conduct. This cost of 10 to 20 hours of total effort per
session results in the early detection of 5-10 defects in 250-500 lines of new
development code or 1000-1500 lines of legacy code.
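The arithmetic above can be checked with a short sketch, using the upper-end figures from the text (5 participants, 2 hours each of preparation and conduct):

```python
# Inspection effort per defect, using the figures quoted in the text.
participants = 5
prep_hours_each = 2      # individual preparation before the session
conduct_hours_each = 2   # time in the inspection session itself

# Total effort for one session: 5 * (2 + 2) = 20 hours.
total_effort = participants * (prep_hours_each + conduct_hours_each)

defects_found = 10       # upper end of the 5-10 defects per session
hours_per_defect = total_effort / defects_found
print(total_effort, hours_per_defect)
```

So even at the most favorable end of the ranges, one session costs about 2 hours of effort per defect found, which is the number usually weighed against the much higher cost of fixing the same defect after release.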
Ans 1 (f)
Knowledge of all the tools required while making the project is needed, and the project
analysis should be reviewed before starting the project. Before starting work, the
software engineer should ensure that the person who is going to build the project is well
aware of all the tools required to make it. Cost estimates should be kept under control,
and a time limit should be set so that the software is completed on time.
Ans :- 1 (g)
Software Requirements :-
1. Operating system
2. MS Office
3. The software platform in which the software is to be built
Hardware Requirements :-
Intel P-IV dual-core processor
Intel P-IV motherboard
512 MB RAM
80 GB HDD
ATX cabinet
Keyboard
Optical mouse
Ans 1 (h)
When components are connected to create larger components they have to pass
through integration testing, whose main purpose is to detect any inconsistency between
the connected components. Once the integration test is completed, a component test will
have to be performed on the new component, which consists of several smaller ones.
Some properties, however, can be said to belong to the system as a whole, like certain
quality attributes. The system, being one huge component, will hence be tested in its
entirety to verify that these requirements are met in a satisfactory way, and this is what
is referred to as system testing. Testing starts with unit testing, then moves on to
integration testing before component testing can be performed. Finally, a system test is
executed.
[Figure: web application architecture with a browser client, a web server running the
application code, and a database server holding the application data; white-box testing
targets the application code, black-box testing the application's externally visible
behavior.]
Structural testing, or white-box testing, on the other hand, implies studying the program
code and testing its different parts. This type of testing is not capable of finding all
kinds of errors, but to its advantage it is easier to determine when you have tested
enough. White-box testing can cover all lines of code given enough time, but such
testing might not uncover errors related to component integration, so the two views
seem to contradict each other. In spite of this, our interpretation suggests that both
favor a combination of white-box and black-box testing.
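The distinction can be illustrated with a small made-up function (the `classify` example below is an assumption, not from the text): a black-box test checks input/output behavior against the specification without reading the code, while white-box tests are written after reading the code so that every branch is exercised.

```python
# A deliberately simple function to test two ways.
def classify(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 18:
        return "minor"
    return "adult"

# Black-box test: derived from the specification alone
# ("an adult is a person aged 18 or over").
assert classify(30) == "adult"

# White-box tests: derived from the code, covering every branch,
# including the boundary and the error path.
assert classify(5) == "minor"
assert classify(18) == "adult"
try:
    classify(-1)
except ValueError:
    pass  # the error branch is reachable and behaves as coded
```

The black-box test alone would pass even if the `age < 0` guard were missing, which is the kind of gap branch-coverage thinking is meant to close.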
Ans 1 (i)
Software inspections are a disciplined engineering practice for detecting and correcting
defects in software artifacts, and preventing their leakage into field operations.
Software Inspections are strict and close examinations conducted on requirements,
specifications, architectures, designs, code, test plan and procedures, and other artifacts.
Leading software indicators of excellence for each artifact type provide the exit criteria
for the activities of the software life cycle. For example, these indicators include
completeness, correctness, style, rules of construction, and multiple views.
The adoption of the software inspections practice is competency enhancing and meets
little resistance among practitioners trained in its use. The adopting organization
benefits from improved predictability in cost and schedule performance, reduced cost of
development and maintenance, reduced defects in the field, increased customer
satisfaction, and improved morale among practitioners.
Dependencies
In order for Software Inspections to be systematically used in statistical process control,
there must be a life cycle model with defined software artifacts. In this context, software
Inspections provide the exit criteria for each life cycle activity. Furthermore, the standard
of excellence of leading indicators for each type of artifact must be specified and used in
practice.
Q2 : (a) You are browsing a web-based application but it is taking too much time to
open. List any five reasons for the same.
Ans 2(a)
1. Too many temporary files can slow down loading.
2. Bad connection.
3. Broken link.
4. The site is running slowly for everyone.
5. The site has too many visitors, so it is bogged down.
6. The server is busy.
7. Low bandwidth.
8. Small RAM size on the client machine.
9. The client system is busy.
10. Slow Internet speed.
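A first step in telling these causes apart is simply measuring how long the fetch itself takes. Below is a minimal sketch; `fake_fetch` is a hypothetical stand-in for a real HTTP call (such as `urllib.request.urlopen`) so the example runs without a network connection:

```python
import time

def time_request(fetch):
    """Time a page fetch; `fetch` is any callable that performs the request."""
    start = time.perf_counter()
    body = fetch()
    elapsed = time.perf_counter() - start
    return body, elapsed

# Hypothetical stand-in for a real HTTP fetch, simulating network latency.
def fake_fetch():
    time.sleep(0.05)
    return b"<html>...</html>"

body, elapsed = time_request(fake_fetch)
print(f"page fetched in {elapsed:.2f}s")
```

If the raw fetch is fast but the page still feels slow, the bottleneck is more likely on the client side (rendering, low RAM, a busy system) than in the server or the connection.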
(b) Do you anticipate any situation where usage of Clean room Software
Engineering for application development is not appropriate ? Explain your
answer.
Cleanroom software engineering is an engineering and managerial process for the
development of high-quality software with certified reliability. The name "Cleanroom"
was taken from the electronics industry, where a physical clean room exists to prevent
the introduction of defects during hardware fabrication. Cleanroom software engineering
reflects the same emphasis on defect prevention rather than defect removal, as well as
certification of reliability for the intended environment of use.
The focus of Cleanroom involves moving from traditional, craft-based software
development practices to rigorous, engineering-based practices. Cleanroom software
engineering yields software that is correct by mathematically sound design, and software
that is certified by statistically valid testing. Reduced cycle time results from an
incremental development strategy and the avoidance of rework.
Cleanroom reduces the cost of errors during development and the incidence of failures
during operation; thus the overall life cycle cost of software developed under Cleanroom
can be expected to be far lower than the industry average.
The following ideas form the foundation for Cleanroom-based development :
Incremental development under statistical quality control (SQC). Incremental
development as practiced in Cleanroom provides a basis for statistical quality control of
the development process. Each increment is a complete iteration of the process, and
measures of performance in each increment are compared with preestablished standards
to determine whether or not the process is in control. If quality standards are not met,
testing of the increment ceases and development returns to the design stage.
Software testing based on statistical principles. In Cleanroom, software testing is viewed
as a statistical experiment. A representative subset of all possible uses of the software is
generated, and performance on the subset is used as a basis for conclusions about
general operational performance.
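The statistical-testing idea above can be sketched as follows. The usage profile, operation names, and the pass/fail results are illustrative assumptions, not from the text:

```python
# Sketch of Cleanroom-style statistical testing: test cases are sampled
# from a usage model (operations weighted by how often real users invoke
# them), and the pass rate on that sample estimates operational reliability.
import random

# Hypothetical usage profile for a ticket reservation system.
usage_profile = {"search": 0.6, "book": 0.3, "cancel": 0.1}

def draw_test_cases(n, seed=0):
    """Sample n operations in proportion to their real-world usage."""
    rng = random.Random(seed)
    ops = list(usage_profile)
    weights = [usage_profile[op] for op in ops]
    return rng.choices(ops, weights=weights, k=n)

def estimate_reliability(results):
    """Fraction of representative test cases that passed."""
    return sum(results) / len(results)

cases = draw_test_cases(1000)
results = [True for _ in cases]   # pretend every sampled run passed
print(estimate_reliability(results))
```

Because the sample mirrors how the software will actually be used, the measured pass rate says something about reliability in operation, not just about the code paths the testers happened to pick.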
Usage considerations
Cleanroom has been documented to be very effective in new development and
reengineering (whole system or major subunits) contexts. The following discussion
highlights areas where Cleanroom affects or differs from more conventional practice :
Team-based development. A Cleanroom project team is small, typically six to eight
persons, and works in a disciplined fashion to ensure the intellectual control of work in
progress. Cleanroom teamwork involves peer review of individual work, but does not
supplant individual creativity. Once the system architecture has been established and the
interfaces between subunits have been defined, individuals typically work alone on a
given system component. Individual designs are working drafts that are then reviewed
by the team. In a large project, multiple small teams may be formed, one for the
development of each subsystem, thus enabling concurrent engineering after the top-level
architecture has been established.
Time allocation across life cycle phases. Because one of the major objectives of
Cleanroom is to prevent errors from occurring, the amount of time spent in the design
phase of a Cleanroom development is likely to be greater than the amount of time
traditionally devoted to design. Cleanroom, however, is not a more time-consuming
development methodology; its greater emphasis on design and verification simply shifts
effort earlier in the life cycle. Management understanding and acceptance of this
essential point, that quality will be achieved by design rather than through testing, must
be reflected in the development schedule. Design and verification will require the
greatest portion of the schedule. Testing may begin later and be allocated less time than
is ordinarily the case. In large Cleanroom projects, where historical data has enabled
comparison of traditional and Cleanroom development schedules, the Cleanroom
schedule has equaled or improved upon the usual development time.
• Empowers employees.
• Eliminates waste, unnecessary management overhead, and obsolete or inefficient
processes.
• Enables revolutionary improvements in many business processes as measured by
quality and customer service.
• Helps top organizations stay on top and low achievers become effective
competitors.
DOD has suggested that the following six tasks be part of any functional management
approach to reengineering projects:
1. Define the framework. Define functional objectives; determine the management
strategy to be followed in streamlining and standardizing processes; and establish
the process, data and information systems baselines from which to begin process
improvement.
2. Analyze. Analyze business processes to eliminate non-value-added processes;
simplify and streamline processes of little value; and identify more effective and
efficient alternatives to the process, data, and system baselines.
3. Evaluate. Conduct a preliminary functional economic analysis to evaluate
alternatives to baseline processes and select a preferred course of action.
4. Plan. Develop detailed statements of requirements, baseline impacts, costs,
benefits, and schedules to implement the planned course of action.
5. Approve. Finalize the functional economic analysis using information from the
planning data, and present it to senior management for approval to proceed with
the proposed process improvement and any associated data or system changes.
6. Execute. Execute the approved process and data changes, and provide
functional management oversight of any associated information system changes.