Software Engineering
Q.1 Solve any five :
a) What is software engineering ? Enlist s/w application domains.
Ans :- Software engineering is the application of a systematic, disciplined, and quantifiable approach to the development, operation, and maintenance of software.
Applications :
1. Web applications
2. Scientific and engineering applications
3. Business applications
4. Medical devices
5. Industrial and process control
6. Embedded systems
c) What is data dictionary ? What is its objective ?
Ans :- A data dictionary is an organized repository that lists and defines every data element used in a system (data objects, attributes, and data stores), along with its meaning, format, and where it is used.
Objective :
Its objective is to give a single, consistent definition of each data item so that analysts and developers share a common vocabulary and ambiguity is avoided.
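A data dictionary entry can be sketched as a simple record. The fields below (aliases, description, format, where-used) follow common data-dictionary conventions, and the "telephone_number" element is a hypothetical example, not from the source.

```python
# Minimal data-dictionary sketch: each entry records the metadata
# commonly kept for a data element (the element itself is illustrative).
data_dictionary = {
    "telephone_number": {
        "aliases": ["phone", "contact_number"],
        "description": "Customer contact number captured at registration.",
        "format": "string of 10-12 digits, optional leading '+'",
        "where_used": ["register_customer", "send_sms_alert"],
    },
}

def lookup(element: str) -> dict:
    """Return the dictionary entry for a data element (KeyError if absent)."""
    return data_dictionary[element]

entry = lookup("telephone_number")
print(entry["description"])
```

Because every module consults the same entry, all of them agree on what "telephone_number" means and how it is formatted.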
Step 3 : Design
Step 4 : Prototyping
Step 5 : Evaluation
Ans :-
1. Pipe and filter architecture
2. Object oriented architecture
3. Layered architecture
4. Data-centered architecture
5. Interpreter
6. Event based implicit invocation
7. Process control
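The first style in the list, pipe and filter, can be illustrated with a minimal Python sketch in which each filter is a generator that transforms a stream and the pipes are plain iterators. The filters themselves are invented for illustration.

```python
# Pipe-and-filter sketch: each filter consumes a stream and yields a
# transformed stream; composing the filters forms the pipeline.
def read_source(lines):
    """Source filter: emits raw lines."""
    for line in lines:
        yield line

def strip_blank(stream):
    """Filter: drops blank lines from the stream."""
    for line in stream:
        if line.strip():
            yield line

def to_upper(stream):
    """Filter: upper-cases every line."""
    for line in stream:
        yield line.upper()

raw = ["hello", "", "world"]
pipeline = to_upper(strip_blank(read_source(raw)))
print(list(pipeline))  # blank line filtered out, text upper-cased
```

Each filter knows nothing about its neighbours, so filters can be reordered or replaced independently, which is the defining property of the style.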
Ans :-
Functionality.
Performance.
Reliability.
Testability.
Availability.
Interoperability.
Security.
Flexibility.
Q.2
a) Explain waterfall model, what are its limitations ?
Ans :- The Waterfall Model was the first Process Model to be introduced. It is
also referred to as a linear-sequential life cycle model. It is very simple to
understand and use. In a waterfall model, each phase must be completed before
the next phase can begin and there is no overlapping in the phases.
The Waterfall model is the earliest SDLC approach that was used for software
development.
Limitations :
1. Working software is produced only late in the life cycle.
2. It cannot accommodate changing requirements; revisiting an earlier phase is difficult once it is complete.
3. It carries high amounts of risk and uncertainty and is a poor model for long, complex, or ongoing projects.
Q.3
a) Explain requirement elicitation techniques for software.
Interviewing :
In formal or informal interviewing, the requirements engineering team puts questions to stakeholders about the system that they currently use and the system to be developed. Requirements are derived from the answers to these questions.
Ethnography :
Ethnography is an observational technique in which an analyst immerses himself or herself in the working environment where the system will be used and observes the day-to-day work. It is important for discovering two types of requirements :
1. Requirements derived from the way in which people actually work, rather
than the way in which business process definitions say they ought to
work. In practice, people never follow formal processes. For example, air
traffic controllers may switch off a conflict alert system that detects
aircraft with intersecting flight paths, even though normal control
procedures specify that it should be used. The conflict alert system is
sensitive and issues audible warnings even when planes are far apart.
Controllers may find these distracting and prefer to use other strategies to
ensure that planes are not on conflicting flight paths.
2. Requirements derived from cooperation and awareness of other people’s
activities. For example, air traffic controllers (ATCs) may use an
awareness of other controllers work to predict the number of aircraft that
will be entering their control sector. They then modify their control
strategies depending on that predicted workload. Therefore, an automated
ATC system should allow controllers in a sector to have some visibility of
the work in adjacent sectors.
Ans :-
1. Data object :
The data object is actually a location or region of storage that contains a
collection of attributes or groups of values that act as an aspect, characteristic,
quality, or descriptor of the object. A vehicle is a data object which can be
defined or described with the help of a set of attributes or data.
2. Attributes :
Attributes define the properties of a data object. The attribute is a quality or
characteristic that defines a person, group, or data objects. It is actually the
properties that define the type of entity. An attribute can have a single value,
multiple values, or a range of values as per our needs.
1. Naming attributes –
To name an instance of a data object, naming attributes are used. User
naming attributes identify user objects such as Login_names and User_Id
for some security purpose. For example- Make and model are naming
attributes in a vehicle data object.
2. Descriptive attributes –
These attributes are used to describe the characteristics or features or the
relationship of the data object. Sometimes also referred to as relationship
attributes. For example- In a vehicle, the color of a data object is a
descriptive attribute that describes the features of the object.
3. Referential attribute –
These are the attributes that are used to formalize binary and associative
relationships and to make reference to an instance in another table. For
example- In a vehicle data object, the owner's Id is a referential attribute
that refers to the owner's record.
3. Relationship :
Relationships indicate how data objects are connected to one another. For
example, toy and shopkeeper are two objects that share the following
relationships :
The shopkeeper orders toys.
The shopkeeper sells toys.
The shopkeeper shows toys.
The shopkeeper stocks toys.
4. Cardinality :
Cardinality specifies the number of occurrences of one object that can be
related to the number of occurrences of another object :
One-to-one (1:1)—An occurrence of [object] 'A' can relate to one and only one
occurrence of [object] 'B,' and an occurrence of 'B' can relate to only one
occurrence of 'A.'
One-to-many (1:N)—An occurrence of [object] 'A' can relate to one or many
occurrences of [object] 'B,' but an occurrence of 'B' can relate to only one
occurrence of 'A.' For example, a mother can have many children, but a child
can have only one mother.
Many-to-many (M:N)—An occurrence of [object] 'A' can relate to one or more
occurrences of 'B,' while an occurrence of 'B' can relate to one or more
occurrences of 'A.'
5. Modality :
Modality is different from cardinality. Its value is "0" when there is no
requirement for the relationship to occur, i.e. the relationship is optional.
The modality value is "1" if an occurrence of the relationship is mandatory.
In simple words, it describes whether a relationship between two or more
entities is required or not.
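The one-to-many cardinality above can be sketched with two small Python classes. Mother/Child is the example from the text; the class layout and helper function are illustrative only.

```python
from dataclasses import dataclass, field

# One-to-many (1:N): a mother can have many children, but each child
# refers back to exactly one mother.
@dataclass
class Mother:
    name: str
    children: list = field(default_factory=list)

@dataclass
class Child:
    name: str
    mother: Mother = None  # starts unset; modality 0 would mean it may stay unset

def add_child(mother: Mother, child: Child):
    child.mother = mother          # exactly one mother per child
    mother.children.append(child)  # one or many children per mother

m = Mother("Asha")
add_child(m, Child("Ravi"))
add_child(m, Child("Meena"))
print(len(m.children), m.children[0].mother.name)
```

The list on one side and the single reference on the other side are exactly what the 1:N cardinality prescribes.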
Q.4
a) What is functional modeling? Explain a data flow model.
Ans :-
FUNCTIONAL MODELS :
The functional model is the third leg of the OMT methodology in addition to the
Object Model and Dynamic Model. “The functional model specifies the results of
a computation without specifying how or when they are computed”. The functional
model specifies the meaning of the operations in the object model and the
actions in the dynamic model, as well as any constraints in the object model.
The functional model consists of multiple data flow diagrams which specify the
meanings of operations and constraints. A data flow diagram (DFD) shows the
functional relationships of the values computed by a system, including input
values, output values, and internal data stores. “A data flow diagram is a graph
which shows the flow of data values from their sources in objects through
processes that transform them to their destinations in other objects”. DFDs do
not show control information, such as the time at which processes are executed,
or decisions among alternate data paths; this type of information belongs to the
dynamic model. Also, the arrangement of values into objects is shown by the
object model, not by the data flow diagram.
A data flow diagram contains processes which transform data, data flows which
move data, actor objects which produce and consume data, and data store
objects that store data passively. Figure 1 shows a data flow diagram for the
display of an icon on a windowing system. Here in this figure, the icon name and
location are inputs to the diagram from an unspecified source. The icon is
expanded to vectors in the application coordinate system using existing icon
definitions. The vectors are clipped to the size of the window, then offset by the
location of the window on the screen, to obtain vectors in the screen coordinate
system. Finally, the vectors are converted to pixel operations that are sent to the
screen buffer for display. The data flow diagram represents the sequence of
transformations performed, as well as the external values and objects that affect
the computation process.
A data flow connects the output of an object or process to the input of another
object or process. It represents an intermediate data value within a
computation. The value is not changed by the data flow.
Each data flow represents a value at some point in the computation. Flows on
the boundary of a data flow diagram are its inputs and outputs. These flows may
be unconnected (if the diagram is a fragment of a complete system), or they
may be connected to objects.
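The icon-display data flow described above can be sketched as a chain of transforming processes in Python. Only the sequence of processes (expand, clip, offset, convert to pixels) comes from the text; the coordinates, icon definition, and function names are invented for illustration.

```python
# DFD sketch for icon display: each function is a process node and the
# values passed between them are the data flows.
ICON_DEFINITIONS = {"folder": [(0, 0), (10, 0), (10, 8), (0, 8)]}

def expand_icon(name, location):
    """Expand an icon name into vectors in application coordinates."""
    return [(x + location[0], y + location[1]) for x, y in ICON_DEFINITIONS[name]]

def clip_to_window(vectors, window_size):
    """Clip vectors to the size of the window."""
    w, h = window_size
    return [(min(x, w), min(y, h)) for x, y in vectors]

def offset_by_window(vectors, window_origin):
    """Offset by the window's screen location to get screen coordinates."""
    ox, oy = window_origin
    return [(x + ox, y + oy) for x, y in vectors]

def to_pixels(vectors):
    """Convert screen-coordinate vectors to pixel operations."""
    return [("set_pixel", x, y) for x, y in vectors]

ops = to_pixels(offset_by_window(
    clip_to_window(expand_icon("folder", (2, 2)), (100, 100)), (50, 50)))
print(ops[0])
```

Note that the code captures only the data transformations; as the text says, when or why these processes run belongs to the dynamic model, not the DFD.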
b) With suitable examples explain modularity, functional independence &
refinement.
Ans :-
Modularity :
Modularity is the degree to which software can be divided into separately named
and addressable components (modules) that can be developed and maintained
independently. When developing software, the software is broken into smaller and
smaller components : into packages of classes, then into the classes themselves,
into the base data-types that make up these classes, into the functions that
they call, and so on.
Functional independence :
Functional independence is achieved by developing modules with a single-minded
purpose (high cohesion) and an aversion to excessive interaction with other
modules (low coupling). Independent modules are easier to develop, test, and
maintain, and errors are less likely to propagate between them.
Refinement :
Stepwise refinement is the idea that software is developed by moving through
the levels of abstraction, beginning at higher levels and, incrementally refining
the software through each level of abstraction, providing more detail at each
increment. At higher levels, the software is merely its design models; at lower
levels there will be some code; at the lowest level the software has been
completely developed.
At the early steps of the refinement process the software engineer does not
necessarily know how the software will perform what it needs to do. This is
determined at each successive refinement step, as the design and the software
is elaborated upon.
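Stepwise refinement can be illustrated by elaborating one abstract step into concrete, cohesive sub-steps; the payroll example and the 10% tax rate below are hypothetical.

```python
# Level 1 (abstract): "compute pay" is a single opaque step.
# Level 2 (refined): the step is elaborated into cohesive sub-steps,
# each of which could itself be refined further at the next level.
def gross_pay(hours: float, rate: float) -> float:
    """Refined sub-step: pay before deductions."""
    return hours * rate

def deductions(gross: float, tax_rate: float = 0.1) -> float:
    """Refined sub-step: amount withheld (flat illustrative tax rate)."""
    return gross * tax_rate

def compute_pay(hours: float, rate: float) -> float:
    """The abstract step, now expressed in terms of its refinements."""
    gross = gross_pay(hours, rate)
    return gross - deductions(gross)

print(compute_pay(40, 20.0))
```

Each helper is also functionally independent: `deductions` knows nothing about hours or rates, so changing the tax rule cannot break the gross-pay calculation.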
A well-architected system can help you avoid repeating code, simplify the
integration of components developed by different teams, and improve the overall
quality and security of the software.
2) Behavior modeling.
Ans :-
1. Scenario based elements : These represent the system from the user's point of
view. Scenario based elements are the use case diagram and user stories.
2. Class based elements : These define the objects manipulated by the system,
their attributes, and their relationships, together with the collaborations
occurring between the classes. Class based elements are the class diagram and
collaboration diagram.
3. Flow elements : These show how information flows through the system and is
transformed along the way. The flow elements are the data flow diagram and
control flow diagram.
4. Behavioral elements : These represent the states of the system and how
external events change those states. The behavioral element is the state
diagram.
Ans :-
Software Project :
A software project is the complete procedure of software development, from
requirements gathering through design, coding, and testing to maintenance,
carried out within a planned schedule and cost. Managing it involves
identifying and addressing issues that may hinder project progress and success.
Ans :- SQA metrics are numerical values that represent some aspect of software
quality or the SQA process. They can be derived from various sources, such as
code analysis, testing results, defect reports, user feedback, or project
documentation.
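As a concrete example of such a metric, defect density (defects per thousand lines of code) can be derived from defect reports and a code-size count; the figures below are made up for illustration.

```python
def defect_density(defects_found: int, size_kloc: float) -> float:
    """Defect density: defects per thousand lines of code (KLOC),
    a common SQA metric derived from testing results."""
    return defects_found / size_kloc

# Hypothetical inputs: 45 defects reported against a 15 KLOC product.
print(defect_density(45, 15.0))  # 3.0 defects per KLOC
```

Tracking this number across releases shows whether quality is improving, which is exactly the kind of trend an SQA process uses such metrics for.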
Ans :- Time estimation for software projects is the practice of estimating how
long it will take to accomplish the individual tasks and the project as a whole.
Accurate time estimation is essential because it supports planning, resource
allocation, and the setting of realistic project deadlines.
Q.7
a) Explain agility principles, what is the impact of human factors ?
Ans :-
Agility Principles :
1. The highest priority is to satisfy the customer through early and continuous
delivery of valuable software.
2. Welcome changing requirements, even late in development; agile processes
harness change for the customer's competitive advantage.
3. Deliver working software frequently, from a couple of weeks to a couple of
months.
4. Business people and developers must work together daily throughout the
project.
5. Build projects around motivated individuals; give them the environment and
support they need, and trust them to get the job done.
In 2005, Miller et al. discussed ethical analysis methods and related topics,
which can inform a discussion about software development techniques when human
values and ethical principles are considered. They suggested that all software
engineers should have skill in some kind of ethical analysis, as well as two
other human factors : utilitarian analysis and deontological analysis. The
former helps a software engineer to think about consequences for developers,
customers, users, and anyone else whose life may be affected by the software
developed. The latter pushes a software developer into somewhat different
emphases [5].
Regular feedback loops concerning the system, design, or process keep the team
aligned and informed of the progress. Courage enables developers to respond to
changing requirements even late into the development cycle, affording
continuous improvements and flexibility.
Q.8
a) Explain the process of software project planning.
Ans :- Software development is a completely new stream in world business, and
there is very little experience in building software products. Most software
products are tailor-made to fit the client's requirements. Moreover, the
underlying technology changes and advances so frequently and rapidly that the
experience of one product may not be applied to the other one. All such
business and environmental constraints bring risk into software development;
hence it is essential to manage software projects efficiently.
Software project planning starts before technical work starts. The main
planning activities are estimating the size of the product, estimating its
cost, estimating the development time, estimating the resource requirements,
and preparing the project schedule.
The size is the crucial parameter for the estimation of the other activities.
Resource requirements are estimated from the cost and development time. The
project schedule, which depends on the resources and the development time, may
prove to be very useful for controlling and monitoring the progress of the
project.
Ans :-
Software Scope :
The software scope outlines the boundaries, goals, and deliverables of a software
project. It defines what the software will and will not do, detailing the
functionalities, features, and constraints. The scope is a crucial document that
helps manage expectations, prevent scope creep, and guide the development
team throughout the project.
Software Feasibility :
Software feasibility assesses whether the project can succeed, typically along
four dimensions : technology (is the project technically feasible?), finance
(can development be completed at an acceptable cost?), time (can the project be
delivered in the required time?), and resources (does the organization have the
people and tools needed?).
Ans :- Software quality shows how good and reliable a product is. To give an
example, consider functionally correct software : it performs all the functions
laid out in the SRS document but has an almost unusable user interface. Even
though it is functionally correct, we do not consider it a high-quality
product.
Another example is a product that has everything the users need but whose code
is almost incomprehensible and unmaintainable. Therefore, the traditional
concept of quality as “fitness of purpose” is not satisfactory for software
products.
Ways to improve the quality of software include following coding standards,
performing reviews and inspections, testing at every stage of development, and
collecting and acting on quality metrics.
b) Explain Six Sigma and its characteristics.
Ans :- Six Sigma is a process for producing output of high and improved
quality. This is done in two phases : identification and elimination. The cause
of defects is identified and appropriately eliminated, which reduces variation
in the whole process. A Six Sigma process is one in which 99.99966% of all
products produced have the same features and are free from defects.
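The 99.99966% figure corresponds to about 3.4 defects per million opportunities (DPMO); a minimal sketch of the computation:

```python
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def yield_percent(dpmo_value: float) -> float:
    """Process yield (percentage defect-free) implied by a DPMO value."""
    return (1 - dpmo_value / 1_000_000) * 100

# At the Six Sigma level of 3.4 DPMO the yield is 99.99966%:
print(yield_percent(3.4))
```

Measuring a process's DPMO and comparing it with 3.4 is how the identification phase quantifies how far the process is from the Six Sigma level.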
Characteristics of Six Sigma :
The characteristics of Six Sigma are as follows :
1. Statistical quality control : “sigma” refers to the standard deviation, a
statistical measure of the variation in a process.
2. Methodical approach : it follows structured methodologies such as DMAIC
(Define, Measure, Analyze, Improve, Control) for existing processes and DMADV
(Define, Measure, Analyze, Design, Verify) for new ones.
3. Fact and data based approach : decisions are driven by measured data rather
than guesswork.
4. Customer focus : quality improvement is defined in terms of customer
requirements.
Ans :- The ISO 9000 series was created by the International Organization for
Standardization (ISO) as international requirements and guidelines for quality
management systems. It was originally introduced in 1987 and over the years
has established itself in the global economy having been adopted in over 178
countries with over one million registrations.
The phrase “ISO 9000 family” or “ISO 9000 series” refers to a group of quality
management standards which are process standards (not product standards).
1) Agile unified process (AUP).
Ans :- The Agile Unified Process (AUP) is a simplified version of the Rational
Unified Process that applies agile techniques such as agile modeling and
test-driven development. AUP is designed to deliver high-quality software that
meets the changing needs of the stakeholders in an efficient and effective
manner.
2) COCOMO model.
Ans :- The COCOMO model is a procedural cost-estimation model for software
projects, often used to reliably predict the parameters associated with a
project such as size, effort, cost, time, and quality. It was proposed by Barry
Boehm in 1981 and is based on the study of 63 projects, which makes it one of
the best-documented models.
1. Basic Model : The basic COCOMO model estimates effort as a function of
program size alone :
Effort = a × (KLOC)^b person-months
Where,
KLOC is the estimated size of the software product expressed in Kilo Lines of
Code, and a and b are constants that depend on the project class (organic,
semi-detached, or embedded).
2. Intermediate Model : The basic Cocomo model considers that the effort
is only a function of the number of lines of code and some constants
calculated according to the various software systems. The intermediate
COCOMO model recognizes these facts and refines the initial estimates
obtained through the basic COCOMO model by using a set of 15 cost
drivers based on various attributes of software engineering.
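The basic COCOMO computation can be sketched as follows, using Boehm's published constants for the three project classes:

```python
# Basic COCOMO constants (a, b for effort; c, d for development time),
# as published by Boehm for the three project classes.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str = "organic"):
    """Return (effort in person-months, development time in months)."""
    a, b, c, d = COCOMO_CONSTANTS[mode]
    effort = a * kloc ** b            # Effort = a * (KLOC)^b
    time = c * effort ** d            # Time   = c * (Effort)^d
    return effort, time

effort, time = basic_cocomo(10, "organic")
print(f"{effort:.1f} PM, {time:.1f} months")
```

For a 10 KLOC organic project this yields roughly 27 person-months of effort; the intermediate model would then multiply this estimate by the product of its 15 cost-driver ratings.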
3) RMMM plan.
Ans :- A risk management strategy is usually included in the software project
plan. It can be divided into the Risk Mitigation, Monitoring, and Management
Plan (RMMM). In this plan, all work done as part of risk analysis is
documented, and the project manager generally uses this RMMM plan as part of
the overall project plan.
In some software teams, risk is documented with the help of a Risk Information
Sheet (RIS). This RIS is controlled by using a database system for easier
management of information i.e. creation, priority ordering, searching, and other
analysis. After documentation of RMMM and start of a project, risk mitigation
and monitoring steps will start.
Risk Mitigation :
It is an activity used to avoid problems (Risk Avoidance).
Steps for mitigating the risks as follows.
1. Finding out the risk.
2. Removing causes that are the reason for risk creation.
3. Controlling the corresponding documents from time to time.
4. Conducting timely reviews to speed up the work.
Risk Monitoring :
It is an activity used for project tracking.
It has the following primary objectives as follows.
1. To check if predicted risks occur or not.
2. To ensure proper application of risk aversion steps defined for risk.
3. To collect data for future risk analysis.
4. To attribute the problems that arise throughout the project to the risks
that caused them.
Risk Management and Planning :
It assumes that the mitigation activity has failed and the risk has become
a reality. This task is done by the project manager when a risk becomes a
reality and causes severe problems. If the project manager has applied risk
mitigation effectively, the risks that do materialize are easier to manage.
The plan records the response that the manager will take for each risk. The
main output of risk management planning is the risk register, which describes
and prioritizes the predicted threats to the software project.
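A Risk Information Sheet like the one described can be sketched as a small record type. The fields follow the usual RIS layout (description, probability, impact, mitigation/monitoring/management entries), and the sample risk is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RiskInformationSheet:
    risk_id: str
    description: str
    probability: float   # estimated probability of occurrence (0..1)
    impact: int          # 1 (negligible) .. 4 (catastrophic)
    mitigation: str      # steps to avoid the problem
    monitoring: str      # indicators to track during the project
    contingency: str     # the management step if the risk becomes real

    def exposure(self) -> float:
        """Risk exposure = probability x impact, used for priority ordering."""
        return self.probability * self.impact

ris = RiskInformationSheet(
    risk_id="R-01",
    description="Key developer may leave mid-project.",
    probability=0.3,
    impact=3,
    mitigation="Document designs; rotate module ownership.",
    monitoring="Watch team morale and attrition indicators.",
    contingency="Re-assign modules using the documented designs.",
)
print(ris.exposure())
```

Storing such sheets in a database, as the text notes, makes the priority ordering a simple sort on the exposure value.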
4) SQA plan.
Ans :-
An SQA plan will work alongside the standard development, prototyping, design,
production, and release cycle for a software product or service. For easy
documentation and referencing, an SQA plan will have different sections like
purpose, references, configuration and management, problem reporting and
corrective actions, tools, code controls, testing methodology, and more.