
Model Answer

Software Engineering
Q.1 Solve any five :
a) What is software engineering ? Enlist s/w application domains.

Ans :- Software Engineering is the process of designing, developing, testing,
and maintaining software. It is a systematic and disciplined approach to software
development that aims to create high-quality, reliable, and maintainable
software.

Application domains:

1. Web development
2. Scientific development
3. Business development
4. Medical devices
5. Industrial and process control
6. Embedded systems

b) Draw a diagram to show software engineering layers.

Ans :-

        +---------------------+
        |        Tools        |
        +---------------------+
        |       Methods       |
        +---------------------+
        |       Process       |
        +---------------------+
        |   A Quality Focus   |
        +---------------------+

Software engineering rests on an organizational commitment to quality (the
"quality focus" layer); the process layer holds the framework activities
together; methods provide the technical "how to"; and tools provide automated
or semi-automated support for the process and the methods.
c) What is data dictionary ? What is its objective ?

Ans :- A Data Dictionary is a collection of names, definitions, and attributes
about data elements that are being used or captured in a database, information
system, or part of a research project. It describes the meanings and purposes of
data elements within the context of a project, and provides guidance on
interpretation, accepted meanings, and representation. A Data Dictionary also
provides metadata about data elements. The metadata included in a Data
Dictionary can assist in defining the scope and characteristics of data elements,
as well as the rules for their usage and application.

Objective :

 Assist in avoiding data inconsistencies across a project
 Help define conventions that are to be used across a project
 Provide consistency in the collection and use of data across multiple
members of a research team
 Make data easier to analyse
 Enforce the use of Data Standards
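The objectives above can be sketched in code: a data dictionary recorded as a machine-readable structure lets a project check values against the agreed definitions. The element names, types, and ranges below are purely illustrative, not taken from any real project.

```python
# A minimal, illustrative data dictionary (all field names are hypothetical).
data_dictionary = {
    "patient_age": {
        "definition": "Age of the patient at admission, in whole years",
        "type": "integer",
        "allowed_range": (0, 120),
        "unit": "years",
    },
    "admission_date": {
        "definition": "Date the patient was admitted",
        "type": "date",
        "format": "YYYY-MM-DD",
    },
}

def validate(element_name, value):
    """Check a value against the rules recorded in the dictionary."""
    entry = data_dictionary[element_name]
    if entry["type"] == "integer":
        low, high = entry["allowed_range"]
        return isinstance(value, int) and low <= value <= high
    return True  # other types are not validated in this sketch

print(validate("patient_age", 34))   # inside the allowed range
print(validate("patient_age", 150))  # violates the recorded range
```

Because every team member validates against the same recorded definitions, inconsistencies (such as an age stored in months rather than years) surface immediately, which is exactly the consistency objective listed above.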

d) What is a use case diagram ?

Ans :- A use case diagram is used to represent the dynamic behaviour of a
system. It encapsulates the system's functionality by incorporating use cases,
actors, and their relationships. It models the tasks, services, and functions
required by a system/subsystem of an application. It depicts the high-level
functionality of a system and also shows how the user interacts with the system.

e) What are the steps in software design ?

Ans :-

Step 1 : Brainstorming & Understanding The Project Requirements

Step 2 : Research & Analysis

Step 3 : Design

Step 4 : Prototyping

Step 5 : Evaluation

f) Enlist software architecture styles.

Ans :-
1. Pipe and filter architecture
2. Object oriented architecture
3. Layered architecture
4. Data-centered architecture
5. Interpreter
6. Event based implicit invocation
7. Process control

g) What is information hiding ?

Ans :- Information hiding is a software design principle, where certain aspects of
a program or module (the "secrets") are inaccessible to clients. The primary goal
is to prevent extensive modification to clients whenever the implementation
details of a module or program are changed.
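A minimal sketch of information hiding (the class and its names are illustrative, not from the source): a stack whose storage strategy is the module's "secret".

```python
class Stack:
    """Clients use push/pop; the storage strategy is the module's 'secret'."""

    def __init__(self):
        # Leading underscore: internal detail, not part of the public interface.
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def __len__(self):
        return len(self._items)

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # 2
```

Because clients only use push and pop, the internal list could later be replaced by, say, a linked list without requiring any change to client code, which is exactly the goal stated above.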

h) List software quality guidelines.

Ans :-
 Functionality.
 Performance.
 Reliability.
 Testability.
 Availability.
 Interoperability.
 Security.
 Flexibility.

Q.2
a) Explain waterfall model, what are its limitations ?

Ans :- The Waterfall Model was the first process model to be introduced. It is
also referred to as a linear-sequential life cycle model and is very simple to
understand and use. The model illustrates the software development process as a
linear sequential flow: the whole process is divided into separate phases, any
phase begins only when the previous phase is complete, the phases do not
overlap, and the outcome of one phase typically acts as the input for the next
phase.

The Waterfall approach was the first SDLC model to be used widely in software
engineering to ensure the success of a project. The different phases of the
Waterfall Model are as follows:
 Requirement Gathering and analysis − All possible requirements of
the system to be developed are captured in this phase and documented in
a requirement specification document.
 System Design − The requirement specifications from first phase are
studied in this phase and the system design is prepared. This system
design helps in specifying hardware and system requirements and helps in
defining the overall system architecture.
 Implementation − With inputs from the system design, the system is
first developed in small programs called units, which are integrated in the
next phase. Each unit is developed and tested for its functionality, which is
referred to as Unit Testing.
 Integration and Testing − All the units developed in the
implementation phase are integrated into a system after testing of each
unit. Post integration the entire system is tested for any faults and
failures.
 Deployment of system − Once the functional and non-functional testing
is done, the product is deployed in the customer environment or released
into the market.
 Maintenance − There are some issues which come up in the client
environment. To fix those issues, patches are released. Also to enhance
the product some better versions are released. Maintenance is done to
deliver these changes in the customer environment.

Limitations :

 No working software is produced until late in the life cycle.
 High amounts of risk and uncertainty.
 Not a good model for complex and object-oriented projects.
 Poor model for long and ongoing projects.
 Not suitable for projects where requirements are at a moderate to
high risk of changing; risk and uncertainty are therefore high with this
process model.
 It is difficult to measure progress within stages.
 Cannot accommodate changing requirements.
 Adjusting scope during the life cycle can end a project.
 Integration is done as a "big bang" at the very end, which does not allow
technological or business bottlenecks or challenges to be identified early.

b) Explain software characteristics.

Ans :-

1. Software does not wear out :
Physical items like clothes, shoes, and ornaments wear out over time, but
software, once created, never wears out. It can be used for as long as
needed, and when any updating is required, the necessary changes can be
made in the same software, which can then be used further with the
updated features.
2. Software is not manufactured :
Software is not manufactured but is developed. So, it does not require any
raw material for its development.
3. Usability of Software :
The usability of the software is the simplicity of the software in terms of
the user. The easier the software is to use for the user, the more is the
usability of the software as more number of people will now be able to use
it and also due to the ease will use it more willingly.
4. Reusability of components :
As the software never wears out, neither do its components, i.e. code
segments. So, if any particular segment of code is required in some other
software, we can reuse the existing code from the software in which it is
already present. This reduces our work and also saves time and money.
5. Flexibility of software :
A software is flexible. What this means is that we can make necessary
changes in our software in the future according to the need of that time
and then can use the same software then also.
6. Maintainability of software :
Every software is maintainable. This means that if any errors or bugs
appear in the software, then they can be fixed.
7. Portability of software :
Portability of the software means that we can transfer our software from
one platform to another that too with ease. Due to this, the sharing of the
software among the developers and other members can be done flexibly.
8. Reliability of Software :
This is the ability of the software to provide the desired functionalities
under every condition. This means that our software should work properly
in each condition.

Q.3
a) Explain requirement elicitation techniques for software.

Ans :- Requirements elicitation involves meeting with stakeholders of different
kinds to discover information about the proposed system. You may supplement
this information with knowledge of existing systems and their usage and
information from documents of various kinds. You need to spend time
understanding how people work, what they produce, how they use other
systems, and how they may need to change to accommodate a new system.

There are two fundamental approaches to requirements elicitation :

1. Interviewing, where you talk to people about what they do.
2. Observation or ethnography, where you watch people doing their job to
see what artifacts they use, how they use them, and so on.

Interviewing :

Formal or informal interviews with system stakeholders are part of most
requirements engineering processes. In these interviews, the requirements
engineering team puts questions to stakeholders about the system that they
currently use and the system to be developed. Requirements are derived from
the answers to these questions.

Interviews may be of two types :

1. Closed interviews, where the stakeholder answers a predefined set of
questions.
2. Open interviews, in which there is no predefined agenda. The requirements
engineering team explores a range of issues with system stakeholders and
hence develops a better understanding of their needs.

Ethnography :

Ethnography is an observational technique that can be used to understand
operational processes and help derive requirements for software to support
these processes. An analyst immerses himself or herself in the working
environment where the system will be used. The day-to-day work is observed,
and notes are made of the actual tasks in which participants are involved. The
value of ethnography is that it helps discover implicit system requirements that
reflect the actual ways that people work, rather than the formal processes
defined by the organization.
Ethnography is particularly effective for discovering two types of requirements :

1. Requirements derived from the way in which people actually work, rather
than the way in which business process definitions say they ought to
work. In practice, people never follow formal processes. For example, air
traffic controllers may switch off a conflict alert system that detects
aircraft with intersecting flight paths, even though normal control
procedures specify that it should be used. The conflict alert system is
sensitive and issues audible warnings even when planes are far apart.
Controllers may find these distracting and prefer to use other strategies to
ensure that planes are not on conflicting flight paths.
2. Requirements derived from cooperation and awareness of other people's
activities. For example, air traffic controllers (ATCs) may use an
awareness of other controllers' work to predict the number of aircraft that
will be entering their control sector. They then modify their control
strategies depending on that predicted workload. Therefore, an automated
ATC system should allow controllers in a sector to have some visibility of
the work in adjacent sectors.

b) With respect to data modeling explain data objects, attributes,
relationships, cardinality & modality.

Ans :-

1. Data object :
The data object is actually a location or region of storage that contains a
collection of attributes or groups of values that act as an aspect, characteristic,
quality, or descriptor of the object. A vehicle is a data object which can be
defined or described with the help of a set of attributes or data.

Different data objects are present which are shown below:

 External entities such as a printer, user, speakers, keyboard, etc.
 Things such as reports, displays, signals.
 Occurrences or events such as alarms, telephone calls.
 Sales databases such as customers, store items, sales.
 Organizational units such as divisions, departments.
 Places such as manufacturing floors, workshops.
 Structures such as student records, accounts, files, documents.

2. Attributes :
Attributes define the properties of a data object. An attribute is a quality or
characteristic that defines a person, group, or data object; it is the set of
properties that define the type of entity. An attribute can have a single value,
multiple values, or a range of values as per our needs.

There are three types of attributes:

1. Naming attributes –
To name an instance of a data object, naming attributes are used. User
naming attributes identify user objects such as Login_names and User_Id
for some security purpose. For example- Make and model are naming
attributes in a vehicle data object.
2. Descriptive attributes –
These attributes are used to describe the characteristics or features or the
relationship of the data object. Sometimes also referred to as relationship
attributes. For example- In a vehicle, the color of a data object is a
descriptive attribute that describes the features of the object.
3. Referential attributes –
These are the attributes that are used to formalize binary and associative
relationships and to make reference to another instance in another table.
For example, in a vehicle data object, an identifier that refers to the
vehicle's owner in a separate owners table is a referential attribute.
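The three attribute kinds can be illustrated with a small sketch of the vehicle data object (all field names here are hypothetical, chosen only for illustration):

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str  # naming attribute: identifies this instance
    make: str        # naming attribute
    model: str       # naming attribute
    color: str       # descriptive attribute: describes a feature of the object
    owner_id: str    # referential attribute: refers to an owner record elsewhere

car = Vehicle("V-001", "Toyota", "Corolla", "blue", "OWN-42")
print(car.owner_id)  # the reference used to look up the related owner record
```

Here `vehicle_id`, `make`, and `model` name the instance, `color` describes it, and `owner_id` formalizes the vehicle/owner relationship by pointing into another table.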

3. Relationship :

The relationship represents the connection or relation between different data
objects and describes the association among entities. Relationships are of three
types: one-to-one, one-to-many, and many-to-many.

For example, a toy and a shopkeeper are two objects that share the following
relationships:
 The shopkeeper orders toys.
 The shopkeeper sells toys.
 The shopkeeper shows toys.
 The shopkeeper stocks toys.

4. Cardinality :

The data model must be capable of representing the number of occurrences of
objects in a given relationship. Tillmann defines the cardinality of an
object/relationship pair in the following manner:

Cardinality is the specification of the number of occurrences of one [object] that
can be related to the number of occurrences of another [object]. Cardinality is
usually expressed as simply 'one' or 'many.' For example, a husband can have
only one wife (in most cultures), while a parent can have many children. Taking
into consideration all combinations of 'one' and 'many,' two [objects] can be
related as:

 One-to-one (1:1) : An occurrence of [object] 'A' can relate to one and
only one occurrence of [object] 'B', and an occurrence of 'B' can relate to
only one occurrence of 'A'.
 One-to-many (1:N) : One occurrence of [object] 'A' can relate to one or
many occurrences of [object] 'B', but an occurrence of 'B' can relate to
only one occurrence of 'A'. For example, a mother can have many
children, but a child can have only one mother.
 Many-to-many (M:N) : An occurrence of [object] 'A' can relate to one or
more occurrences of 'B', while an occurrence of 'B' can relate to one or
more occurrences of 'A'.

5. Modality :

Modality is different from cardinality. Its value is "0" when there is no
requirement for the relationship to occur, i.e. when the relationship is optional.
The modality value is "1" if there is a compulsion for the occurrence of the
relationship. In simple words, modality describes whether a relationship between
two or more entities is required or not.
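As an illustrative sketch (the class and field names are assumptions, not from the source), the mother/child example above combines cardinality and modality: the relationship is 1:N, with modality 0 on the mother side (she may have no children yet) and modality 1 on the child side (every child must reference a mother).

```python
from dataclasses import dataclass, field

@dataclass
class Mother:
    name: str
    # 1:N side with modality 0: the list may legitimately be empty.
    children: list = field(default_factory=list)

@dataclass
class Child:
    name: str
    # Modality 1: a Child cannot be constructed without a mother reference.
    mother: Mother

m = Mother("Asha")
c1 = Child("Ravi", m)
c2 = Child("Mira", m)
m.children.extend([c1, c2])
print(len(m.children))  # one mother related to many children
```

Making the `mother` field a required constructor argument is one way to encode a modality of "1" directly in the data model, while the optional, possibly empty `children` list encodes a modality of "0".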

Q.4
a) What is functional modeling? Explain a data flow model.

Ans :-

 FUNCTIONAL MODELS :

The functional model is the third leg of the OMT methodology, in addition to the
Object Model and Dynamic Model. "The functional model specifies the results of
a computation without specifying how or when they are computed." The
functional model specifies the meaning of the operations in the object model and
the actions in the dynamic model, as well as any constraints in the object model.

Non-interactive programs, such as compilers, have a trivial dynamic model; the
purpose of a compiler is to compute a function. The functional model is the main
model for such programs, although the object model is important for any
problem with nontrivial data structures. Many interactive programs also have a
significant functional model. By contrast, databases often have a trivial
functional model, since their purpose is to store and organize data, not to
transform it.

For example, a spreadsheet is a kind of functional model. In many cases, the
values in the spreadsheet are trivial and cannot be structured further. The only
interesting object structure in the spreadsheet is the cell. The aim of the
spreadsheet is to specify values in terms of other values.

 DATA FLOW MODEL :

The functional model consists of multiple data flow diagrams which specify the
meanings of operations and constraints. A data flow diagram (DFD) shows the
functional relationships of the values computed by a system, including input
values, output values, and internal data stores. "A data flow diagram is a graph
which shows the flow of data values from their sources in objects through
processes that transform them to their destinations in other objects". DFDs do
not show control information, such as the time at which processes are executed,
or decisions among alternate data paths. This type of information belongs to the
dynamic model. Also, the arrangement of values into objects is shown by the
object model, but not by the data flow diagram.

A data flow diagram contains processes which transform data, data flows which
move data, actor objects which produce and consume data, and data store
objects that store data passively. Figure 1 shows a data flow diagram for the
display of an icon on a windowing system. Here in this figure, the icon name and
location are inputs to the diagram from an unspecified source. The icon is
expanded to vectors in the application coordinate system using existing icon
definitions. The vectors are clipped to the size of the window, then offset by the
location of the window on the screen, to obtain vectors in the screen coordinate
system. Finally, the vectors are converted to pixel operations that are sent to the
screen buffer for display. The data flow diagram represents the sequence of
transformations performed, as well as the external values and objects that affect
the computation process.

A data flow connects the output of an object or process to the input of another
object or process. It represents an intermediate data value within a
computation. The value is not changed by the data flow.

Each data flow represents a value at some point in the computation. Flows on
the boundary of a data flow diagram are its inputs and outputs. These flows may
be unconnected (if the diagram is a fragment of a complete system), or they
may be connected to objects.
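As an illustrative sketch (the function names and the coordinate arithmetic are assumptions, not taken from the figure), the icon-display pipeline described above can be written as a chain of processes where each return value is a data flow between them:

```python
# Each function below is a DFD "process"; the values passed between them
# are the data flows. Actors and data stores are simulated by plain data.

def expand_to_vectors(icon_name, location, icon_definitions):
    """Expand an icon into vectors in the application coordinate system."""
    return [(location[0] + dx, location[1] + dy)
            for dx, dy in icon_definitions[icon_name]]

def clip_to_window(vectors, window_size):
    """Clip vectors to the size of the window."""
    w, h = window_size
    return [(x, y) for x, y in vectors if 0 <= x < w and 0 <= y < h]

def offset_by_window(vectors, window_origin):
    """Offset by the window's screen location to get screen coordinates."""
    ox, oy = window_origin
    return [(x + ox, y + oy) for x, y in vectors]

# "Existing icon definitions" acts as the data store in this sketch.
icons = {"arrow": [(0, 0), (1, 1), (2, 2), (50, 50)]}

vectors = expand_to_vectors("arrow", (5, 5), icons)   # (55, 55) falls outside
vectors = clip_to_window(vectors, (40, 40))           # ...and is clipped here
pixels = offset_by_window(vectors, (100, 100))
print(pixels)
```

Note how the sketch mirrors the DFD's properties: each intermediate value is unchanged once produced, and no control information (ordering decisions, timing) appears anywhere in the flow itself.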
b) With suitable examples explain modularity, functional independence &
refinement.

Ans :-

 Modularity :
When developing software, the software is broken into smaller and smaller
components, into packages of classes, then into the classes themselves, into the
base data-types that make up these classes, into the functions that they call,
and so on.

This ability to divide a software system into discrete portions is called
modularity, which is an important component of abstraction and architectural
design. Having modular software allows it to be more easily comprehended by
the developers and our customers. However, modularity has a drawback: while
increasing modularity can increase our understanding of the software, after a
certain point the software will consist of enough modules that we will again have
a problem seeing how they all interact.

As with any abstraction tool, it is important to choose the right level of
modularity (the right level of abstraction) for the software.

Modularised software is easier to develop and to test; it can more easily
accommodate change, since change should be restricted to only a small number
of modules.

 Functional independence :

Functional independence occurs where modules (such as a package or class)
address a specific and constrained range of functionality. The modules provide
interfaces only to this functionality. By constraining their functionality, the
modules require the help of fewer other modules to carry out their functionality.
The functional independence of a module can be judged using two concepts,
cohesion and coupling. Cohesion is the degree to which a module performs only
one function; coupling is the degree to which a module requires other modules
to perform its function.

Having many functionally independent modules helps a software system be
resilient to change: because functionally independent modules rely on fewer
other modules, there is less chance of changes to those modules spreading to
the ones which are functionally independent.

Functional independence makes modules easier to develop and test. Changes
made to how they perform their function are less likely to affect the software as
a whole.

Functional independence is one of the goals of using information hiding and
modularity. Consider this: there can be no good information hiding if the
software has not been broken into modules, and if the software has not been
broken into modules, there cannot be functionally independent modules. If no
information were hidden from other modules, and every module always
depended on all the others to perform its function, any change made to the
software would always result in changes having to be made elsewhere in the
software in order to handle it.
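A minimal illustration (the module and names are invented for this sketch): a functionally independent module is highly cohesive, doing one thing only (temperature conversion), and loosely coupled, needing no other modules, so clients depend only on its narrow interface.

```python
# A cohesive "module": every function serves the single purpose of
# temperature conversion, and nothing here depends on any other module.

def celsius_to_fahrenheit(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# A client depends only on the interface, not on how conversion is done.
print(celsius_to_fahrenheit(100))  # 212.0
```

If the conversion formulas were later replaced by a lookup table, no client would change: that is the resilience to change that cohesion plus low coupling buys.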

 Refinement :
Stepwise refinement is the idea that software is developed by moving through
the levels of abstraction, beginning at higher levels and, incrementally refining
the software through each level of abstraction, providing more detail at each
increment. At higher levels, the software is merely its design models; at lower
levels there will be some code; at the lowest level the software has been
completely developed.

At the early steps of the refinement process, the software engineer does not
necessarily know how the software will perform what it needs to do. This is
determined at each successive refinement step, as the design and the software
are elaborated upon.

Refinement can be seen as the complement of abstraction. Abstraction is
concerned with hiding lower levels of detail; it moves from lower to higher
levels. Refinement is the movement from higher levels of detail to lower levels.
Both concepts are necessary in developing software.

Q.5 Write short notes on the following (any three) :
1) Importance of s/w architecture

Ans :- The importance of software architecture cannot be overemphasized. A
good architecture provides a clear understanding of the system and makes it
easy to develop, maintain, and deploy. It minimizes the cost of the software
development cycle and maximizes the productivity of the developers.

A software architect is a project's technical leader with experience, knowledge,
and commitment to the tasks involved in software development, as well as an
understanding of the business area and of functional and non-functional design
aspects.

A well-architected system can help you avoid repeating code, simplify the
integration of components developed by different teams, and improve the overall
quality and security of the software.

2) Behavior modeling.

Ans :- Behavioral modeling is an approach used by companies to better
understand and predict consumer actions. Behavioral modeling uses available
consumer and business spending data to estimate future behavior in specific
circumstances.

Behavioral modeling is used by financial institutions to estimate the risk
associated with providing funds to an individual or business, and by marketing
firms to target advertising.

Companies use behavioral modeling to target offers and advertising to


customers. Banks also use behavioral modeling to create deeper risk profiles of
customer groups.
Behavioral economics also relies on behavioral modeling to predict the behavior
of agents that falls outside of what would be considered entirely fact-based or
rational behavior.

3) Elements of the Analysis model.

Ans :-

1. Scenario-based elements : These represent the system from the user's point
of view. Scenario-based elements include the use case diagram and user stories.

2. Class-based elements : The objects of this type of element are manipulated
by the system. These elements define the objects, their attributes, and their
relationships, as well as the collaboration occurring between the classes.
Class-based elements include the class diagram and collaboration diagram.

3. Behavioural elements : These represent the state of the system and how it is
changed by external events. The behavioural elements include the sequence
diagram and state diagram.

4. Flow-oriented elements : As information flows through a computer-based
system, it gets transformed. These elements show how the data objects are
transformed while they flow between the various system functions. The flow
elements include the data flow diagram and control flow diagram.

4) Software project and challenges.

Ans :-
 Software Project :

A planned and organized effort to develop a software product or system; the
creation of a software solution to meet specific needs or objectives. Examples
include developing a mobile app, creating an enterprise-level software system,
and implementing a new software module for an existing system.

 Challenges in Software Projects :

Obstacles or difficulties encountered during the execution of a software project,
and the work of identifying and addressing issues that may hinder project
progress and success. Examples include unclear requirements, scope creep,
inadequate communication, resource constraints, technical challenges, and
changing regulations.

Q.6 Solve any five of the following :

a) What do you mean by agile development ?

Ans :- Agile development is an iterative software-development methodology
which teams use in projects. Self-organized, cross-functional teams frequently
analyze circumstances and user needs to adapt projects. Scrum teams
constantly improve quality in sprints with short-term deliverables.

b) What is software project ?

Ans :- A software project is the complete procedure of software development,
from requirement gathering to testing and maintenance, carried out according to
the execution methodologies in a specified period of time to achieve the
intended software product.

c) What is a metric in SQA ?

Ans :- SQA metrics are numerical values that represent some aspect of software
quality or the SQA process. They can be derived from various sources, such as
code analysis, testing results, defect reports, user feedback, or project
documentation.

d) Define extreme programming.

Ans :- Extreme programming (XP) is one of the most important software
development frameworks among the Agile models. It is used to improve software
quality and responsiveness to customer requirements. The extreme programming
model recommends taking the best practices that have worked well in past
program development projects to extreme levels.

e) Describe the management spectrum in software project planning.

Ans :- The management spectrum describes the management of a software
project, or how to make a project successful. It focuses on the four P's: people,
product, process, and project. The manager of the project has to control all
these P's to have a smooth flow in the project's progress and to reach the goal.

f) What is the difference between quality control and quality assurance ?

Ans :-

Parameters       | Quality Assurance (QA)                          | Quality Control (QC)
Objective        | It focuses on providing assurance that the      | It focuses on fulfilling
                 | quality requested will be achieved.             | the quality requested.
Technique        | It is the technique of managing quality.        | It is the technique to verify quality.
Type of tool     | It is a managerial tool.                        | It is a corrective tool.
Process          | It is process oriented.                         | It is product oriented.
Technique type   | It is a preventive technique.                   | It is a corrective technique.
Measure type     | It is a proactive measure.                      | It is a reactive measure.
Aim              | It aims to prevent defects in the system.       | It aims to identify defects or bugs in the system.
Time consumption | It is a less time-consuming activity.           | It is a more time-consuming activity.
Example          | Verification                                    | Validation

g) What do you mean by software project risk ?

Ans :- The possibility of suffering a loss during the software development
process is called a software risk. The loss can be anything: an increase in
production cost, development of poor-quality software, or not being able to
complete the project on time.
h) What is the role of estimation models in software engineering ?

Ans :- Estimation models support the practice of estimating how long it will take
to accomplish individual tasks and the project as a whole. For a firm, it is
essential to realize that precise estimation helps in planning, resource
allocation, and setting realistic project deadlines.

Q.7
a) Explain agility principles, what is the impact of human factors ?

Ans :-

Agility Principles :

1. Customer satisfaction by rapid delivery of useful software – Agile asks
for frequent delivery with more iterations; many small builds are delivered
in the iteration process, as Agile values working software over
comprehensive documentation.
2. Welcome changing requirements, even late in development – Agile
accepts changes of requirement at any stage, as it values responding to
change over following a plan.
3. Working software is delivered frequently (weeks rather than months) –
which Agile believes will lead to fewer defects.
4. Working software is the principal measure of progress – which Agile
believes will bring a good ROI for the customer.
5. Sustainable development, able to maintain a constant pace – including
testing frequently; it requires continuous testing.
6. Close, daily co-operation between business people and developers – Agile
advocates a collaborative approach, as it values customer collaboration
over contract negotiation.
7. Face-to-face conversation is the best form of communication – Agile
requires close communication between people.
8. Projects are built around motivated individuals, who should be trusted –
Agile values individuals and interactions over processes and tools.
9. Continuous attention to technical excellence and good design – Agile
advocates standing on the "shoulders of giants", which reduces risk and
development time.
10. Simplicity, the art of maximizing the amount of work not done – Agile
advocates avoiding things that waste time, which is why Agile produces
less documentation compared to other methodologies.
11. Self-organizing teams – Agile believes these bring higher efficiency and
lower communication cost.
12. Regular adaptation to changing circumstances – Agile requires lean,
adaptable, active software processes, so the process can be fitted to the
project.
Impact Of Human Factors :
Human and social factors have a very strong impact on the success of software
development and the final system. The related research in the SE field, namely
Human-Centered Software Engineering (HCSE) and Social Software Engineering
(SSE), concerns the human and social aspects of the software development
process. One of the main observations in this field is that the concepts,
principles, and technologies made for social software applications are applicable
to software development itself, as SE is inherently a social activity too.
Accordingly, some methods and tools have been proposed to support different
parts of HCSE/SSE, for instance social system design or social requirements
engineering.

In 2005, Miller et al. discussed ethical analysis methods and related topics that
can inform a discussion about software development techniques when human
values and ethical principles are considered. They suggested that all software
engineers should have skill in some kind of ethical analysis, as well as in two
further human-factor analyses: utilitarian analysis and deontological analysis.
The former helps a software engineer think about consequences for developers,
customers, users, and anyone else whose life may be affected by the software
developed. The latter pushes a software developer into somewhat different
emphases [5].

b) Explain extreme programming. What is XP process ?

Ans :- Extreme Programming (XP) is a software development methodology that
aims to enhance the quality of software and its responsiveness to changing
customer requirements. It promotes adaptability over predictability and
emphasizes constant communication, feedback, simplicity, and respect for each
team member. It also incorporates continuous testing and planning for software
that might evolve over time.

Extreme Programming (XP), a software development methodology, is important
because it emphasizes customer satisfaction through continuous delivery of
valuable software. It employs principles such as simplicity, communication,
feedback, and courage, making it a flexible approach adaptable to changing
customer requirements. Furthermore, XP promotes high-quality software
production through its practices including testing, frequent “releases” in short
development cycles, and a close, daily cooperation between business
stakeholders and developers. This allows quick response to changes and reduces
the cost of the changes themselves. As such, Extreme Programming fosters
efficient use of resources, accelerates development, and enhances productivity.
XP Process :

Extreme Programming (XP) is primarily used within the field of software
development to enhance the quality and responsiveness to evolving customer
requirements. As a type of agile software development, it advocates frequent
releases in short development cycles, improving productivity and introducing
checkpoints where new customer requirements can be adopted. It’s a solution-
oriented method that helps programmers provide high-quality, customer-centric
software that meets their ever-changing needs throughout the entire
development process.

The main purposes of Extreme Programming include fostering better
communication among team members, presenting a competitive yet
collaborative working environment, and ensuring high-quality software delivery.
This approach emphasizes communication, simplicity, feedback, and courage.
Through communication, it breaks down barriers between members to
understand and meet requirements. Simplicity supports delivering the simplest
software for today, without unduly complicating future enhancements.

Regular feedback loops concerning the system, design, or process keep the team
aligned and informed of the progress. Courage enables developers to respond to
changing requirements even late into the development cycle, affording
continuous improvements and flexibility.
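A core XP practice is test-driven development: the test is written before the code it exercises. The sketch below is a minimal, hypothetical illustration (the `discount` function and its test are invented for this example, not part of any standard XP toolkit):

```python
# XP style: write the test first, watch it fail, then write the
# simplest code that makes it pass ("simplicity" in action).
def test_discount():
    assert discount(200, 0.25) == 150.0   # 25% off 200
    assert discount(80, 0) == 80.0        # no discount

def discount(price, rate):
    """Return the price reduced by the given fractional rate."""
    return price * (1 - rate)

test_discount()   # passes once the minimal implementation exists
```

In XP this cycle repeats in very short iterations, so every small release is backed by an automated test suite.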

Q.8
a) Explain the process of software project planning.

Ans :- A Software Project is the complete methodology of software
development, from requirement gathering to testing and support, carried out
through defined execution procedures within a specified period to achieve the
intended software product.

Software development is a relatively new stream in world business, and there
is little accumulated experience in building software products. Most software
products are customized to fit the client's requirements. Most significantly, the
underlying technology changes and advances so broadly and rapidly that
experience with one product may not carry over to another. All such business
and environmental constraints bring risk into software development; hence, it
is fundamental to manage software projects efficiently.

The software project manager is responsible for planning and scheduling
project development. They manage the work to ensure that it is completed to
the required standard, and they monitor progress to check that development is
on time and within budget. The project plan must address major issues such as
size and cost estimation, scheduling, project monitoring, personnel selection
and evaluation, and risk management. To plan a successful software project, we
must understand :

 The scope of the work to be completed
 The risks to be analyzed
 The resources required
 The tasks to be accomplished
 The process to be followed

Software project planning starts before technical work begins. The various
planning activities are as follows :
Size is the crucial parameter for the estimation of other activities. Resource
requirements are estimated based on cost and development time. The project
schedule may prove to be very useful for controlling and monitoring the
progress of the project; it depends on the resources and development time.
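The dependency chain just described (size → effort → cost and schedule) can be sketched as a toy calculation; the productivity and cost figures below are invented purely for illustration:

```python
def plan_estimates(size_kloc, productivity_kloc_per_pm, cost_per_pm):
    """Derive effort and cost from an estimated size, as project planning does."""
    effort_pm = size_kloc / productivity_kloc_per_pm   # person-months
    cost = effort_pm * cost_per_pm
    return effort_pm, cost

# A 20 KLOC product, 2 KLOC per person-month, 5,000 per person-month:
effort, cost = plan_estimates(20, 2.0, 5000)
print(effort, cost)   # 10.0 person-months, 50000.0 total cost
```

Real planning replaces these constants with estimates calibrated from past projects, which is exactly why size estimation comes first.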

b) Explain in detail software scope and feasibility.

Ans :-
Software Scope :

The software scope outlines the boundaries, goals, and deliverables of a software
project. It defines what the software will and will not do, detailing the
functionalities, features, and constraints. The scope is a crucial document that
helps manage expectations, prevent scope creep, and guide the development
team throughout the project.

Software scope delineates the boundaries and objectives of a software project,
serving as a foundational document that outlines what the software will achieve
and the limitations it will adhere to. This comprehensive definition encompasses
both functional and non-functional aspects, detailing user interactions, system
processes, constraints, and dependencies.

It incorporates user stories or use cases to offer a more nuanced understanding
of the software's behavior. Crucially, the software scope manages expectations,
prevents scope creep by explicitly stating inclusions and exclusions, and acts as
a guiding roadmap for development teams. By clarifying objectives, boundaries,
and requirements, the software scope ensures alignment between stakeholders
and facilitates effective project planning.
Importance of Software Scope :
 Prevents Scope Creep : Clearly defining the scope helps prevent the
addition of unplanned features or changes during the development
process.
 Manages Expectations : Stakeholders, including clients and
development teams, have a clear understanding of what to expect from
the software.
 Facilitates Planning : Helps in creating realistic project plans, timelines,
and resource allocations based on the defined scope.
 Guides Development : Acts as a roadmap for the development team,
providing a clear direction for building the software.

Software Feasibility :

A feasibility study is an assessment of the practicality and viability of a proposed
software project. It involves analyzing various aspects to determine whether the
project is technically, economically, and operationally feasible. A feasibility study
is typically conducted before the project officially begins to ensure that it is
worth pursuing.

A feasibility study is a pivotal assessment conducted before embarking on a
software project to gauge its practicality and viability. This multifaceted
analysis encompasses technical, economic, operational, schedule, legal,
regulatory, and social dimensions. Technical feasibility scrutinizes the
availability and reliability of requisite technologies, while economic feasibility
evaluates costs and potential returns on investment. Operational feasibility
assesses the software's integration with existing systems and its impact on
day-to-day operations.

The study also considers legal compliance, societal acceptance, and
adherence to industry regulations. Through this comprehensive evaluation, a
feasibility study identifies risks, aids in cost-benefit analysis, informs
decision-making, and ensures that realistic expectations are set among
stakeholders. It is an essential precursor to successful software development,
offering insights that guide resource planning and risk mitigation strategies.
Importance of Feasibility Study :

 Risk Mitigation : Identifies potential risks and challenges early in the
project, allowing for proactive mitigation strategies.
 Cost-Benefit Analysis : Helps in determining whether the benefits of the
software justify the costs and resources required for development.
 Decision-Making : Provides decision-makers with the information
needed to make informed choices about whether to proceed with the
project.
 Resource Planning : Assists in planning for necessary resources,
including technology, personnel, and budget.
 Realistic Expectations : Ensures that stakeholders have realistic
expectations regarding the project's success and challenges.
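Economic feasibility and cost-benefit analysis often come down to a simple return-on-investment check; the figures used below are hypothetical:

```python
def roi(total_benefit, total_cost):
    """Return on investment as a fraction: (benefit - cost) / cost."""
    return (total_benefit - total_cost) / total_cost

# If the project costs 200,000 and is expected to yield 260,000 in benefits:
print(f"ROI = {roi(260_000, 200_000):.0%}")   # ROI = 30%
```

A negative or marginal ROI at this stage is exactly the kind of early signal a feasibility study is meant to surface before development begins.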
Q.9
a) Explain how we can achieve ‘software quality’.

Ans :- Software quality shows how good and reliable a product is. To give an
example, consider functionally correct software : it performs all functions as
specified in the SRS document, but it has an almost unusable user interface.
Even though it is functionally correct, we do not consider it a high-quality
product.

Another example is a product that has everything the users need but has
almost incomprehensible and unmaintainable code. Therefore, the traditional
concept of quality as "fitness of purpose" for software products is not
satisfactory.

Factors Of Software Quality :

Here is a list of things that can be done to improve the quality of software :

1. Strong Plan for Management : Make a detailed plan for quality
assurance that covers the whole process. Define quality engineering tasks
at the start of the project, making sure they fit with the skills of the team
and the needs of the project.
2. Evaluation of the strategic team’s skills : At the start of the project,
do a thorough evaluation of the team’s skills. Find out where the team
might need more training or knowledge to make sure they are ready to
take on quality engineering challenges.
3. Channels of communication that work : Set up clear ways for
everyone on the team to talk to each other. Clear communication makes it
easier for people to work together and makes sure that everyone is on the
same page with quality goals and procedures.
4. Identifying problems ahead of time : Set up ways to find problems
before they happen throughout the whole development process. This
includes finding bugs early on, integrating changes all the time, and using
automated testing to find problems quickly and fix them.
5. Learning and adapting all the time : Promote a culture of always
learning. Keep up with the latest best practices, new technologies, and
changing methods in your field so you can adapt and improve your quality
engineering processes.
6. Integration of Automated Testing : Automated testing should be built
into the development process. Automated tests not only make testing
faster, but they also make sure that evaluations are consistent and can be
done again and again, which raises the quality of software as a whole.
7. Full-Service Checkpoints : Set up checkpoints at important points in the
development process. At these checkpoints, there should be thorough
code reviews, testing, and quality checks to find and fix problems before
they get worse.
8. Adding customer feedback : Ask clients for feedback and use it as part
of the development process. Client feedback helps improve the quality of
software by giving developers useful information about what users want
and how the software will be used in real life.
9. Keep an eye on and improve performance : Set up tools and routines
for monitoring performance all the time. Find possible bottlenecks or
places where the software could be better, and then improve it so that it
meets or exceeds user expectations.
10. Excellence in Documentation : Stress the importance of carefully
writing down the steps used to make and test software. Well-documented
code, test cases, and procedures make things clearer, make it easier to
work together, and make maintenance easier in the future, which
improves the quality of software in the long run.
11. Best Practices for Security : Best practices for security should be
used from the start of the project. Deal with security issues before they
happen by doing things like reviewing the code, checking for
vulnerabilities, and following security standards.
12. Focus on the end-user experience : In the quality engineering
process, put the end-user experience first. Find out what the users want,
test the software’s usability, and make sure it fits their needs and
preferences perfectly.

b) Explain

i) Six sigma for software engineering

Ans :- Six Sigma is the process of producing high-quality, improved output.
This is done in two phases – identification and elimination : the causes of
defects are identified and eliminated appropriately, which reduces variation in
the whole process. A Six Sigma process is one in which 99.99966% of all the
products produced have the same features and are free from defects.
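The 99.99966% figure corresponds to at most 3.4 defects per million opportunities (DPMO), which can be computed directly; the inspection numbers below are made up for illustration:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities, the metric behind sigma levels."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# 7 defects found in 1,000 units with 10 defect opportunities each:
print(dpmo(7, 1000, 10))   # about 700 DPMO -- far above the 3.4 Six Sigma target
```

Tracking DPMO over time is how the "identification and elimination" phases are measured in practice.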
Characteristics of Six Sigma :
The Characteristics of Six Sigma are as follows :

1. Statistical Quality Control :
Six Sigma is derived from the Greek letter σ (sigma), which denotes
Standard Deviation in statistics. Standard Deviation is used for measuring
the quality of output.
2. Methodical Approach :
Six Sigma is a systematic approach applied through DMAIC and DMADV,
which can be used to improve the quality of production. DMAIC stands for
Define-Measure-Analyze-Improve-Control, while DMADV stands for
Define-Measure-Analyze-Design-Verify.
3. Fact and Data-Based Approach :
The statistical and methodical method shows the scientific basis of the
technique.
4. Project and Objective-Based Focus :
The Six Sigma process is implemented to focus on the requirements and
conditions.
5. Customer Focus :
The customer focus is fundamental to the Six Sigma approach. The quality
improvement and control standards are based on specific customer
requirements.
6. Teamwork Approach to Quality Management :
The Six Sigma process requires organizations to get organized for
improving quality.

ii) ISO 9000 Quality standards.

Ans :- The ISO 9000 series was created by the International Organization for
Standardization (ISO) as international requirements and guidelines for quality
management systems. It was originally introduced in 1987 and over the years
has established itself in the global economy having been adopted in over 178
countries with over one million registrations.

The phrase “ISO 9000 family” or “ISO 9000 series” refers to a group of quality
management standards which are process standards (not product standards).

 ISO 9000 Quality management systems – Fundamentals and Vocabulary,
referenced in all ISO 9000 Standards.
 ISO 9001 Quality management systems – Requirements, contains the
requirements an organization must comply with to become ISO 9001
certified.
 ISO 9002 – Guidelines for the application of ISO 9001:2015
 ISO 9004 – Managing for the sustained success of an organization,
provides guidelines for sustaining QMS success through evaluation and
performance improvement.

The ISO 9000 Series of Quality Standards is not industry-specific and is
applicable to any manufacturing, distribution, or service organization. It is
managed by Technical Committee (TC) 176, comprised of international members
from many industries and backgrounds.
Q.10 Write short notes on the following. (any three) :
1) Agile unified process (AUP).

Ans :- Agile Unified Process (AUP) is a lightweight, iterative, and
adaptable software development methodology that combines the best practices
of agile development with the disciplined approach of the Unified Process (UP).

AUP is designed to deliver high-quality software that meets the changing needs
of the stakeholders in an efficient and effective manner.

Key Principles of Agile Unified Process

The following are the key principles of AUP :

 People and Communication : AUP emphasizes the importance of people
and communication in software development. It promotes face-to-face
communication between stakeholders, customers, and development teams
to ensure that everyone is on the same page.
 Continuous Improvement : AUP is based on the principle of continuous
improvement. It encourages teams to learn from their experiences, and to
continuously improve their processes and practices.
 Collaboration : AUP emphasizes the importance of collaboration between
stakeholders, customers, and development teams. It promotes the use of
collaborative tools and techniques to enhance communication and
collaboration.
 Iterative and Incremental : AUP is an iterative and incremental process
that delivers working software in small, incremental releases. It allows
teams to gather feedback and incorporate changes at each iteration,
which leads to better quality software.
 Risk-Driven : AUP is a risk-driven process that focuses on identifying,
prioritizing, and mitigating risks throughout the development lifecycle. It
encourages teams to manage risks proactively to avoid potential problems
and delays.

2) COCOMO model.

Ans :- The COCOMO (Constructive Cost Model) is a procedural cost
estimation model for software projects, often used to reliably predict the
various parameters associated with a project such as size, effort, cost, time,
and quality. It was proposed by Barry Boehm in 1981 and is based on the study
of 63 projects, which makes it one of the best-documented models.

In COCOMO, projects are categorized into three types :

1. Organic : A software project is said to be an organic type if the
team size required is adequately small, the problem is well
understood and has been solved in the past and also the team
members have a nominal experience regarding the problem.
2. Semi-detached : A software project is said to be a Semi-
detached type if the vital characteristics such as team size,
experience, and knowledge of the various programming
environment lie in between that of organic and embedded. The
projects classified as Semi-Detached are comparatively less
familiar and difficult to develop compared to the organic ones
and require more experience and better guidance and
creativity. Eg: Compilers or different Embedded Systems can
be considered Semi-Detached types.
3. Embedded : A software project requiring the highest level of
complexity, creativity, and experience requirement fall under
this category. Such software requires a larger team size than
the other two models and also the developers need to be
sufficiently experienced and creative to develop such complex
models.
Type Of COCOMO Model :

1. Basic COCOMO Model : The basic COCOMO model gives an approximate
estimate of the project parameters. The following expressions give the basic
COCOMO estimation model :
Effort = a1 * (KLOC)^a2 PM
Tdev = b1 * (Effort)^b2 Months

Where,

KLOC is the estimated size of the software product, expressed in Kilo Lines
of Code,

a1,a2,b1,b2 are constants for each group of software products,

Tdev is the estimated time to develop the software, expressed in months,

Effort is the total effort required to develop the software product,
expressed in person-months (PMs).
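Plugging Boehm's published Basic COCOMO constants into these expressions gives a small calculator (the 32 KLOC figure is just an example input):

```python
# Standard Basic COCOMO constants (Boehm, 1981) for each project class:
# a1, a2 drive effort; b1, b2 drive development time.
COEFFS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort in person-months, development time in months)."""
    a1, a2, b1, b2 = COEFFS[mode]
    effort = a1 * kloc ** a2      # Effort = a1 * (KLOC)^a2
    tdev = b1 * effort ** b2      # Tdev   = b1 * (Effort)^b2
    return effort, tdev

effort, tdev = basic_cocomo(32, "organic")
print(f"Effort = {effort:.1f} PM, Tdev = {tdev:.1f} months")   # ~91.3 PM, ~13.9 months
```

Note how the embedded constants penalize size more steeply (a2 = 1.20), reflecting the higher complexity of that project class.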

2. Intermediate Model : The basic COCOMO model assumes that effort
is only a function of the number of lines of code and some constants
calculated according to the various software systems. In reality, effort
also depends on other project attributes, such as the capability of the
developers and the complexity of the product. The intermediate COCOMO
model recognizes these facts and refines the initial estimates obtained
through the basic COCOMO model by using a set of 15 cost drivers based
on various attributes of software engineering.

3. Detailed COCOMO Model : Detailed COCOMO incorporates all qualities of
the standard version with an assessment of the cost driver’s effect on
each method of the software engineering process. The detailed model
uses various effort multipliers for each cost driver property. In detailed
COCOMO, the whole software is divided into multiple modules; COCOMO is
then applied to each module to estimate its effort, and the module efforts
are summed.

3) RMMM plan.

Ans :- A risk management strategy is usually included in the software project
plan. It can be divided into the Risk Mitigation, Monitoring, and Management
Plan (RMMM). In this plan, all work done as part of risk analysis is
documented. The project manager generally uses this RMMM plan as part of
the overall project plan.

In some software teams, risk is documented with the help of a Risk Information
Sheet (RIS). This RIS is controlled by using a database system for easier
management of information i.e. creation, priority ordering, searching, and other
analysis. After documentation of RMMM and start of a project, risk mitigation
and monitoring steps will start.

 Risk Mitigation :
It is an activity used to avoid problems (Risk Avoidance).
Steps for mitigating the risks as follows.
1. Finding out the risk.
2. Removing causes that are the reason for risk creation.
3. Controlling the corresponding documents from time to time.
4. Conducting timely reviews to speed up the work.
 Risk Monitoring :
It is an activity used for project tracking.
It has the following primary objectives as follows.
1. To check if predicted risks occur or not.
2. To ensure proper application of risk aversion steps defined for risk.
3. To collect data for future risk analysis.
4. To allocate what problems are caused by which risks throughout the
project.
 Risk Management and Planning :
It assumes that the mitigation activity failed and the risk is a reality. This
task is done by Project manager when risk becomes reality and causes
severe problems. If the project manager effectively uses project
mitigation to remove risks successfully then it is easier to manage the
risks.
This shows the response that will be taken for each risk by a manager. The
main output of the risk management plan is the risk register, which describes
and focuses on the predicted threats to a software project.
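The Risk Information Sheet mentioned above can be modeled as a simple record; the fields and the example risk are illustrative, and risk exposure (probability × impact) is a common way to prioritize the entries:

```python
from dataclasses import dataclass

@dataclass
class RiskInfoSheet:
    """One illustrative entry of a Risk Information Sheet (RIS)."""
    risk_id: str
    description: str
    probability: float     # likelihood of occurrence, 0.0 to 1.0
    impact_cost: float     # estimated loss if the risk becomes reality
    mitigation: str        # planned risk-avoidance step

    @property
    def exposure(self):
        # Risk exposure = probability x impact; higher values get priority.
        return self.probability * self.impact_cost

risk = RiskInfoSheet("R1", "Key developer leaves mid-project",
                     0.3, 40_000, "Cross-train a second developer")
print(risk.exposure)   # about 12000.0
```

Storing such records in a database, as the text notes, makes creation, priority ordering, and searching straightforward.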

4) SQA plan.

Ans :-

 Software Quality Assurance (SQA) plan :


A Software Quality Assurance Plan revolves around making sure that the
product or service reaches the market trouble-free and bug-free. It should
also meet the requirements defined in the SRS (software requirements
specification).

The purpose of an SQA plan is three-fold. It comprises the following :

 Establishing the QA responsibilities of the team in question
 Listing areas of concern that need to be reviewed and audited
 Identifying the SQA work products

SQA strives to encompass all software development processes and activities,


from defining requirements, coding, debugging, and all other activities until
release. As the name suggests, it focuses on preserving and delivering quality
for a software product.

An SQA plan will work alongside the standard development, prototyping, design,
production, and release cycle for a software product or service. For easy
documentation and referencing, an SQA plan will have different sections like
purpose, references, configuration and management, problem reporting and
corrective actions, tools, code controls, testing methodology, and more.
