
Software Processes

The term software refers to the set of computer programs, procedures, and associated documents (flowcharts, manuals, etc.) that describe the programs and how they are to be used.

A software process is the set of activities and associated outcomes that produce a software product. Software engineers mostly carry out these activities. There are four key process activities, which are common to all software processes. These activities are:

1. Software specification: The functionality of the software and the constraints on its operation must be defined.
2. Software development: Software that meets the specification must be produced.
3. Software validation: The software must be validated to ensure that it does what the customer wants.
4. Software evolution: The software must evolve to meet changing customer needs.

The Software Process Model


A software process model is a simplified description of a software process, presented from a particular perspective. Models, by their nature, are a simplification, so a software process model is an abstraction of the actual process being described. Process models may include activities that are part of the software process, software products, and the roles of the people involved in software engineering. Some examples of the types of software process models that may be produced are:

1. A workflow model: This shows the series of activities in the process along with their inputs, outputs, and dependencies. The activities in this model represent human actions.
2. A dataflow or activity model: This represents the process as a set of activities, each of which carries out some data transformation. It shows how the input to the process, such as a specification, is converted to an output, such as a design. The activities here may be at a lower level than the activities in a workflow model. They may represent transformations carried out by people or by computers.
3. A role/action model: This represents the roles of the people involved in the software process and the activities for which they are responsible.

There are several general models or paradigms of software development:

1. The waterfall approach: This takes the above activities and represents them as separate process phases such as requirements specification, software design, implementation, testing, and so on. After each stage is defined, it is "signed off" and development goes on to the following stage.
2. Evolutionary development: This method interleaves the activities of specification, development, and validation. An initial system is rapidly developed from a very abstract specification.
3. Formal transformation: This method is based on producing a formal mathematical system specification and transforming this specification, using mathematical methods, into a program. These transformations are 'correctness preserving.' This means that you can be sure that the developed program meets its specification.
4. System assembly from reusable components: This method assumes that parts of the system already exist. The system development process focuses on integrating these parts rather than developing them from scratch.

Waterfall model
Winston Royce introduced the Waterfall Model in 1970. This model has five phases: requirements analysis and specification, design, implementation and unit testing, integration and system testing, and operation and maintenance. The steps always follow this order and do not overlap. The developer must complete every phase before the next phase begins. This model is named the "Waterfall Model" because its diagrammatic representation resembles a cascade of waterfalls.

1. Requirements analysis and specification phase: The aim of this phase is to understand the exact requirements of the customer and to document them properly. Both the customer and the software developer work together to document all the functional, performance, and interfacing requirements of the software. It describes the "what" of the system to be produced and not the "how." In this phase, a large document called the Software Requirement Specification (SRS) is created, which contains a detailed description of what the system will do in plain language.
2. Design Phase: This phase aims to transform the requirements gathered in the SRS
into a suitable form which permits further coding in a programming language. It defines
the overall software architecture together with high level and detailed design. All this
work is documented as a Software Design Document (SDD).

3. Implementation and unit testing: During this phase, design is implemented. If the
SDD is complete, the implementation or coding phase proceeds smoothly, because all
the information needed by software developers is contained in the SDD.

During unit testing, the code is thoroughly examined and modified. Small modules are tested in isolation initially. After that, these modules are tested by writing some overhead code to check the interaction between these modules and the flow of intermediate output.

4. Integration and System Testing: This phase is highly crucial, as the quality of the end product is determined by the effectiveness of the testing carried out. Better testing leads to satisfied customers, lower maintenance costs, and accurate results. Unit testing determines the correctness of individual modules. In this phase, however, the modules are tested for their interactions with each other and with the system.

5. Operation and maintenance phase: Maintenance is the activity performed once the software has been delivered to the customer, installed, and made operational.

When to use SDLC Waterfall Model?


Some Circumstances where the use of the Waterfall model is most suited are:

o When the requirements are constant and not changed regularly.
o The project is short.
o The situation is calm.
o When the tools and technology used are consistent and not changing.
o When resources are well prepared and available to use.

Advantages of Waterfall model


o This model is simple to implement, and the number of resources required for it is minimal.
o The requirements are simple and explicitly declared; they remain unchanged during the entire project development.
o The start and end points for each phase are fixed, which makes it easy to track progress.
o The release date for the complete product, as well as its final cost, can be determined before development.
o It gives easy control and clarity for the customer due to a strict reporting system.

Disadvantages of Waterfall model


o In this model, the risk factor is higher, so this model is not suitable for large and complex projects.
o This model cannot accommodate changes in requirements during development.
o It becomes tough to go back to a previous phase. For example, if the application has already shifted to the coding phase and there is a change in requirements, it becomes tough to go back and change it.
o Since the testing is done at a later stage, it does not allow identifying the challenges and risks in the earlier phases, so a risk reduction strategy is difficult to prepare.

V-Model
The V-Model is also referred to as the Verification and Validation Model. In this model, each phase of the SDLC must be completed before the next phase starts. It follows a sequential design process, the same as the waterfall model. Testing of the product is planned in parallel with the corresponding stage of development.

Verification: It involves a static analysis method (review) done without executing code. It is the process of evaluating the product development process to determine whether the specified requirements are met.

Validation: It involves dynamic analysis methods (functional, non-functional); testing is done by executing code. Validation is the process of evaluating the software after the completion of the development process to determine whether the software meets the customer's expectations and requirements.

So the V-Model contains Verification phases on one side and Validation phases on the other side. The Verification and Validation phases are joined by the coding phase in a V shape. Thus it is known as the V-Model.

The various phases of the Verification side of the V-Model are:

1. Business requirement analysis: This is the first step, in which the product requirements are understood from the customer's perspective. This phase involves detailed communication to understand the customer's expectations and exact requirements.
2. System Design: In this stage, system engineers analyze and understand the business of the proposed system by studying the user requirements document.
3. Architecture Design: The baseline in selecting the architecture is that it should cover all the requirements; it typically consists of the list of modules, brief functionality of each module, their interface relationships, dependencies, database tables, architecture diagrams, technology details, etc. The integration test plan is prepared in this phase.
4. Module Design: In the module design phase, the system is broken down into small modules. The detailed design of the modules is specified, which is known as Low-Level Design.
5. Coding Phase: After designing, the coding phase starts. Based on the requirements, a suitable programming language is decided. There are some guidelines and standards for coding. Before check-in to the repository, the final build is optimized for better performance, and the code goes through many code reviews.

The various phases of the Validation side of the V-Model are:

1. Unit Testing: In the V-Model, Unit Test Plans (UTPs) are developed during the module design phase. These UTPs are executed to eliminate errors at the code or unit level. A unit is the smallest entity which can exist independently, e.g., a program module. Unit testing verifies that the smallest entity functions correctly when isolated from the rest of the code/units.
2. Integration Testing: Integration Test Plans are developed during the Architecture Design phase. These tests verify that groups created and tested independently can coexist and communicate among themselves.
3. System Testing: System Test Plans are developed during the System Design phase. Unlike Unit and Integration Test Plans, System Test Plans are composed by the client's business team. System testing ensures that the expectations from the developed application are met.
4. Acceptance Testing: Acceptance testing is related to the business requirement analysis part. It includes testing the software product in the user's environment. Acceptance tests reveal compatibility problems with the other systems available in the user's environment. They also discover non-functional problems, such as load and performance defects, in the real user environment.

When to use V-Model?

o When the requirements are well defined and not ambiguous.
o The V-shaped model should be used for small to medium-sized projects where requirements are clearly defined and fixed.
o The V-shaped model should be chosen when ample technical resources are available with essential technical expertise.

Advantage (Pros) of V-Model:

1. Easy to understand.
2. Testing activities like planning and test design happen well before coding.
3. This saves a lot of time, hence a higher chance of success over the waterfall model.
4. Avoids the downward flow of defects.
5. Works well for small projects where requirements are easily understood.

Disadvantage (Cons) of V-Model:

1. Very rigid and least flexible.
2. Not good for complex projects.
3. Software is developed during the implementation stage, so no early prototypes of the software are produced.
4. If any changes happen midway, then the test documents, along with the requirement documents, have to be updated.

Incremental Model
The Incremental Model is a process of software development in which the requirements are divided into multiple standalone modules of the software development cycle. In this model, each module goes through the requirements, design, implementation, and testing phases. Every subsequent release of the module adds function to the previous release. The process continues until the complete system is achieved.

The various phases of the incremental model are as follows:
1. Requirement analysis: In the first phase of the incremental model, the product analysis experts identify the requirements, and the system's functional requirements are understood by the requirement analysis team. This phase plays a crucial role in developing software under the incremental model.
2. Design & Development: In this phase of the incremental model of the SDLC, the design of the system functionality and the development method are completed successfully. Whenever new functionality is added to the software, the incremental model goes through the design and development phase again.

3. Testing: In the incremental model, the testing phase checks the performance of each
existing function as well as additional functionality. In the testing phase, the various
methods are used to test the behavior of each task.

4. Implementation: The implementation phase enables the coding of the development system. It involves the final coding of the designs produced in the design and development phase and testing of the functionality in the testing phase. After completion of this phase, the working functionality of the product is enhanced and upgraded, up to the final system product.

When do we use the Incremental Model?

o When the requirements are clearly understood at the top level.
o A project has a lengthy development schedule.
o When the software team is not very well skilled or trained.
o When the customer demands a quick release of the product.
o When you can develop prioritized requirements first.

Advantage of Incremental Model


o Errors are easy to recognize.
o Easier to test and debug.
o More flexible.
o Simpler to manage risk because risks are handled during each iteration.
o The client gets important functionality early.

Disadvantage of Incremental Model


o Need for good planning
o Total Cost is high.
o Well defined module interfaces are needed.
Spiral Model
The spiral model, initially proposed by Boehm, is an evolutionary software process model that couples the iterative feature of prototyping with the controlled and systematic aspects of the linear sequential model. It provides the potential for rapid development of new versions of the software. Using the spiral model, the software is developed in a series of incremental releases. During the early iterations, the incremental release may be a paper model or prototype. During later iterations, more and more complete versions of the engineered system are produced.


Each cycle in the spiral is divided into four parts:

Objective setting: Each cycle in the spiral starts with the identification of purpose for
that cycle, the various alternatives that are possible for achieving the targets, and the
constraints that exists.
Risk Assessment and reduction: The next phase in the cycle is to evaluate these various alternatives against the goals and constraints. The focus of evaluation in this stage is on the risks perceived for the project.

Development and validation: The next phase is to develop strategies that resolve
uncertainties and risks. This process may include activities such as benchmarking,
simulation, and prototyping.

Planning: Finally, the next step is planned. The project is reviewed, and a decision is made whether to continue with a further cycle of the spiral. If it is decided to continue, plans are drawn up for the next phase of the project.

The development phase depends on the remaining risks. For example, if performance or user-interface risks are considered more critical than the program development risks, the next phase may be an evolutionary development that includes developing a more detailed prototype to resolve those risks.

The risk-driven feature of the spiral model allows it to accommodate any mixture of specification-oriented, prototype-oriented, simulation-oriented, or other approaches. An essential element of the model is that each cycle of the spiral is completed by a review that covers all the products developed during that cycle, including plans for the next cycle. The spiral model works for development as well as enhancement projects.

When to use Spiral Model?

o When frequent releases are required.
o When the project is large.
o When requirements are unclear and complex.
o When changes may be required at any time.
o For large and high-budget projects.

Advantages

o High amount of risk analysis


o Useful for large and mission-critical projects.

Disadvantages

o Can be a costly model to use.


o Risk analysis requires highly specific expertise.
o Doesn't work well for smaller projects.

Evolutionary Process Model


The evolutionary process model resembles the iterative enhancement model. The same phases as defined for the waterfall model occur here in a cyclical fashion. This model differs from the iterative enhancement model in the sense that it does not require a useful product at the end of each cycle. In evolutionary development, requirements are implemented by category rather than by priority.

For example, in a simple database application, one cycle might implement the graphical user interface (GUI), another file manipulation, another queries, and another updates. All four cycles must be complete before there is a working product available. The GUI allows the users to interact with the system, file manipulation allows data to be saved and retrieved, queries allow users to get data out of the system, and updates allow users to put data into the system.

Benefits of Evolutionary Process Model


o Use of EVO brings a significant reduction in risk for software projects.
o EVO can reduce costs by providing a structured, disciplined avenue for experimentation.
o EVO allows the marketing department access to early deliveries, facilitating the development of documentation and demonstrations.
o Better fit of the product to user needs and market requirements.
o Manage project risk with the definition of early cycle content.
o Uncover key issues early and focus attention appropriately.
o Increase the opportunity to hit market windows.
o Accelerate sales cycles with early customer exposure.
o Increase management visibility of project progress.
o Increase product team productivity and motivation.

Prototype Model
The prototype model requires that, before carrying out the development of the actual software, a working prototype of the system be built. A prototype is a toy implementation of the system. A prototype usually turns out to be a very crude version of the actual system, possibly exhibiting limited functional capabilities, low reliability, and inefficient performance compared to the actual software. In many instances, the client only has a general view of what is expected from the software product. In such a scenario, where there is an absence of detailed information regarding the input to the system, the processing needs, and the output requirements, the prototyping model may be employed.

Steps of Prototype Model

1. Requirement Gathering and Analysis
2. Quick Design
3. Build a Prototype
4. Assessment or User Evaluation
5. Prototype Refinement
6. Engineer Product

Advantage of Prototype Model

1. Reduces the risk of incorrect user requirements.
2. Good where requirements are changing or uncommitted.
3. A regular, visible process aids management.
4. Supports early product marketing.
5. Reduces maintenance cost.
6. Errors can be detected much earlier, as the system is built side by side.

Disadvantage of Prototype Model

1. An unstable/badly implemented prototype often becomes the final product.
2. Requires extensive customer collaboration:
o Costs the customer money.
o Needs a committed customer.
o Difficult to finish if the customer withdraws.
o May be too customer-specific, with no broad market.
3. Difficult to know how long the project will last.


4. Easy to fall back into the code and fix without proper requirement analysis, design,
customer evaluation, and feedback.
5. Prototyping tools are expensive.
6. Special tools & techniques are required to build a prototype.
7. It is a time-consuming process.

Agile Model
The meaning of Agile is swift or versatile. "Agile process model" refers to a software development approach based on iterative development. Agile methods break tasks into smaller iterations, or parts, and do not directly involve long-term planning. The project scope and requirements are laid down at the beginning of the development process. Plans regarding the number of iterations, the duration, and the scope of each iteration are clearly defined in advance.

Each iteration is considered as a short time "frame" in the Agile process model, which
typically lasts from one to four weeks. The division of the entire project into smaller parts
helps to minimize the project risk and to reduce the overall project delivery time
requirements. Each iteration involves a team working through a full software
development life cycle including planning, requirements analysis, design, coding, and
testing before a working product is demonstrated to the client.
Phases of Agile Model:
The phases in the Agile model are as follows:

1. Requirements gathering
2. Design the requirements
3. Construction/ iteration
4. Testing/ Quality assurance
5. Deployment
6. Feedback

1. Requirements gathering: In this phase, you must define the requirements. You
should explain business opportunities and plan the time and effort needed to build the
project. Based on this information, you can evaluate technical and economic feasibility.

2. Design the requirements: When you have identified the project, work with
stakeholders to define requirements. You can use the user flow diagram or the high-
level UML diagram to show the work of new features and show how it will apply to your
existing system.

3. Construction/ iteration: When the team defines the requirements, the work begins.
Designers and developers start working on their project, which aims to deploy a working
product. The product will undergo various stages of improvement, so it includes simple,
minimal functionality.

4. Testing: In this phase, the Quality Assurance team examines the product's performance and looks for bugs.

5. Deployment: In this phase, the team issues a product for the user's work
environment.

6. Feedback: After releasing the product, the last step is feedback. In this, the team
receives feedback about the product and works through the feedback.

Agility Principles:
1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.

3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.

4. Business people and developers must work together daily throughout the project.

5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.

6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

7. Working software is the primary measure of progress.

8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.

9. Continuous attention to technical excellence and good design enhances agility.

10. Simplicity – the art of maximizing the amount of work not done – is essential.

11. The best architectures, requirements, and designs emerge from self-organizing teams.

12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

Agile Testing Methods:


o Scrum
o Crystal
o Dynamic Software Development Method (DSDM)
o Feature Driven Development (FDD)
o Lean Software Development
o Extreme Programming (XP)

Scrum
SCRUM is an agile development process focused primarily on ways to manage tasks in team-based development environments.

There are three roles in it, and their responsibilities are:

o Scrum Master: The Scrum Master sets up the team, arranges the meetings, and removes obstacles in the process.
o Product owner: The product owner creates the product backlog, prioritizes the backlog, and is responsible for the distribution of functionality in each iteration.
o Scrum Team: The team manages its own work and organizes the work to complete the sprint or cycle.

Extreme Programming (XP)
This type of methodology is used when customers have constantly changing demands or requirements, or when they are not sure about the system's performance.

Crystal:
There are three concepts of this method:
1. Chartering: Multiple activities are involved in this phase, such as making a development team, performing feasibility analysis, developing plans, etc.
2. Cyclic delivery: Under this, two or more delivery cycles take place, in which:

A. The team updates the release plan.

B. The integrated product is delivered to the users.

3. Wrap up: According to the user environment, this phase performs deployment and post-deployment activities.

Dynamic Software Development Method (DSDM):


DSDM is a rapid application development strategy for software development that provides an agile project delivery framework. The essential features of DSDM are that users must be actively involved and teams are given the authority to make decisions. The techniques used in DSDM are:

1. Time Boxing
2. MoSCoW Rules
3. Prototyping

The DSDM project contains seven stages:

1. Pre-project
2. Feasibility Study
3. Business Study
4. Functional Model Iteration
5. Design and build Iteration
6. Implementation
7. Post-project

Feature Driven Development (FDD):

This method focuses on "designing and building" features. In contrast to other agile methods, FDD describes small steps of the work that should be achieved separately per feature.
Lean Software Development:
Lean software development methodology follows the principle of "just-in-time production." The lean method aims to increase the speed of software development and reduce costs. Lean development can be summarized in seven phases.

1. Eliminating Waste
2. Amplifying learning
3. Defer commitment (deciding as late as possible)
4. Early delivery
5. Empowering the team
6. Building Integrity
7. Optimize the whole

When to use the Agile Model?


o When frequent changes are required.
o When a highly qualified and experienced team is available.
o When a customer is ready to have a meeting with a software team all the time.
o When project size is small.

Advantage (Pros) of Agile Method:


1. Frequent Delivery
2. Face-to-Face Communication with clients.
3. Efficient design and fulfils the business requirement.
4. Anytime changes are acceptable.
5. It reduces total development time.

Disadvantages (Cons) of Agile Model:


1. Due to the shortage of formal documentation, it creates confusion, and crucial decisions taken throughout the various phases can be misinterpreted at any time by different team members.
2. Due to the lack of proper documentation, once the project is completed and the developers are allotted to another project, maintenance of the finished project can become difficult.

Data modeling
What is data modeling?
Data modeling is the process of creating a simplified diagram of a software
system and the data elements it contains, using text and symbols to represent
the data and how it flows. Data models provide a blueprint for designing a new
database or reengineering a legacy application. Overall, data modeling helps
an organization use its data effectively to meet business needs for
information.

A data model can be thought of as a flowchart that illustrates data entities, their attributes and the relationships between entities. It enables data management and analytics teams to document data requirements for applications and identify errors in development plans before any code is written.

Alternatively, data models can be created through reverse-engineering efforts that extract them from existing systems. That's done to document the structure of relational databases that were built on an ad hoc basis without upfront data modeling and to define schemas for sets of raw data stored in data lakes or NoSQL databases to support specific analytics applications.

Why is data modeling done?


Data modeling is a core data management discipline. By providing a visual
representation of data sets and their business context, it helps pinpoint
information needs for different business processes. It then specifies the
characteristics of the data elements that will be included in applications and in
the database or file system structures used to process, store and manage the
data.

Data modeling can also help establish common data definitions and internal
data standards, often in connection with data governance programs. In
addition, it plays a big role in data architecture processes that document data
assets, map how data moves through IT systems and create a conceptual
data management framework. Data models are a key data architecture
component, along with data flow diagrams, architectural blueprints, a unified
data vocabulary and other artifacts.

Traditionally, data models have been built by data modelers, data architects and other data management professionals with input from business analysts, executives and users. But data modeling is also now an important skill for data scientists and analysts involved in developing business intelligence applications and more complex data science and advanced analytics ones.

What are the different types of data models?


Data modelers use three types of models to separately represent business
concepts and workflows, relevant data entities and their attributes and
relationships, and technical structures for managing the data. The models
typically are created in a progression as organizations plan new applications
and databases. These are the different types of data models and what they
include:

1. Conceptual data model. This is a high-level visualization of the business or analytics processes that a system will support. It maps out the kinds of data that are needed, how different business entities interrelate and associated business rules. Business executives are the main audience for conceptual data models, to help them see how a system will work and ensure that it meets business needs. Conceptual models aren't tied to specific database or application technologies.

2. Logical data model. Once a conceptual data model is finished, it can be used to create a less-abstract logical one. Logical data models show how data entities are related and describe the data from a technical perspective. For example, they define data structures and provide details on attributes, keys, data types and other characteristics. The technical side of an organization uses logical models to help understand required application and database designs. But like conceptual models, they aren't connected to a particular technology platform.

3. Physical data model. A logical model serves as the basis for the creation
of a physical data model. Physical models are specific to the database
management system (DBMS) or application software that will be
implemented. They define the structures that the database or a file system
will use to store and manage the data. That includes tables, columns,
fields, indexes, constraints, triggers and other DBMS elements. Database
designers use physical data models to create designs and generate
schema for databases.

These three types of data models fit together as part of the overall modeling process.
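To make the distinction concrete, here is a minimal illustrative sketch in Python (using the standard sqlite3 module) of how a physical data model for a hypothetical customer entity might be realized for one specific DBMS; the table, columns, and index below are invented for illustration, not taken from any particular model.

import sqlite3

# A physical data model is DBMS-specific: the logical "customer" entity becomes
# a concrete SQLite table with column types, a primary key, a constraint and an
# index -- the structures the database will actually use to store the data.
conn = sqlite3.connect(":memory:")   # in-memory database, for illustration only
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,   -- surrogate key
        first_name  TEXT NOT NULL,
        last_name   TEXT NOT NULL,
        email       TEXT UNIQUE            -- constraint defined at the physical level
    )
""")
conn.execute("CREATE INDEX idx_customer_last_name ON customer (last_name)")
conn.execute("INSERT INTO customer (first_name, last_name, email) VALUES (?, ?, ?)",
             ("Ada", "Lovelace", "ada@example.com"))
print(conn.execute("SELECT customer_id, last_name FROM customer").fetchall())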
Data modeling techniques
Data modeling emerged in the 1960s as databases became more widely used
on mainframes and then minicomputers. It enabled organizations to bring
consistency, repeatability and disciplined development to data processing and
management. That's still the case, but the techniques used to create data
models have evolved along with the development of new types of databases
and computer systems.

These are the data modeling approaches used most widely over the years,
including several that have largely been supplanted by newer techniques.

1. Hierarchical data modeling


Hierarchical data models organize data in a treelike arrangement of parent
and child records. A child record can have only one parent, making this a one-
to-many modeling method. The hierarchical approach originated in mainframe
databases -- IBM's Information Management System (IMS) is the best-known
example. Although hierarchical data models were mostly superseded by
relational ones beginning in the 1980s, IMS is still available and used by many
organizations. A similar hierarchical method is also used today in XML,
formally known as Extensible Markup Language.
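As a small illustrative sketch (plain Python, with made-up department and employee records), a hierarchical model can be pictured as nested parent/child records in which each child belongs to exactly one parent:

# Hierarchical model: a tree of parent records, each owning its child records.
# A child (employee) appears under exactly one parent (department).
company = {
    "name": "Example Corp",              # root record
    "departments": [                     # children of the root
        {"name": "Engineering",
         "employees": [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]},
        {"name": "Sales",
         "employees": [{"id": 3, "name": "Linus"}]},
    ],
}

# Navigation always walks down from parent to child, one path per record.
for dept in company["departments"]:
    for emp in dept["employees"]:
        print(dept["name"], "->", emp["name"])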

2. Network data modeling


This was also a popular data modeling option in mainframe databases that
isn't used as much now. Network data models expanded on hierarchical ones
by allowing child records to be connected to multiple parent records. The
Conference on Data Systems Languages, a now-defunct technical standards
group commonly called CODASYL, adopted a network data model
specification in 1969. Because of that, the network technique is often referred
to as the CODASYL model.
3. Relational data modeling
The relational data model was created as a more flexible alternative to
hierarchical and network ones. First described in a 1970 technical paper by
IBM researcher Edgar F. Codd, the relational model maps the relationships
between data elements stored in different tables that contain sets of rows and
columns. Relational modeling set the stage for the development of relational
databases, and their widespread use made it the dominant data modeling
technique by the mid-1990s.

4. Entity-relationship data modeling


A variation of the relational model that can also be used with other types of
databases, entity-relationship (ER) models visually map entities, their
attributes and the relationships between different entities. For example, the
attributes of an employee data entity could include last name, first name,
years employed and other relevant data. ER models provide an efficient
approach for data capture and update processes, making them particularly
suitable for transaction processing applications.
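As a rough sketch (Python dataclasses, with a hypothetical Employee entity and a Department entity it relates to), the entities, attributes, and one relationship of a simple ER model could be written down like this:

from dataclasses import dataclass

# Each class corresponds to an entity; its fields are the entity's attributes.
@dataclass
class Department:
    dept_id: int
    name: str

@dataclass
class Employee:
    emp_id: int
    last_name: str
    first_name: str
    years_employed: int
    dept_id: int      # relationship attribute: which department the employee works in

# One department and two employees related to it through dept_id.
eng = Department(dept_id=10, name="Engineering")
staff = [Employee(1, "Lovelace", "Ada", 5, eng.dept_id),
         Employee(2, "Hopper", "Grace", 3, eng.dept_id)]
print([e.last_name for e in staff if e.dept_id == eng.dept_id])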

(Figure: an entity-relationship data model created from Microsoft's AdventureWorks sample database. Source: Rick Sherman, Athena IT Solutions.)
5. Dimensional data modeling
Dimensional data models are primarily used in data warehouses and data
marts that support business intelligence applications. They consist of fact
tables that contain data about transactions or other events and dimension
tables that list attributes of the entities in the fact tables. For example, a fact
table could detail product purchases by customers, while connected
dimension tables hold data about the products and customers. Notable types
of dimensional models are star schemas, which connect a fact table to
different dimension tables, and snowflake schemas, which include multiple
levels of dimension tables.
(Figure: a dimensional data model built from Microsoft's AdventureWorks sample database. Source: Rick Sherman, Athena IT Solutions.)
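A tiny sketch of the star-schema idea in plain Python, with fabricated product, customer and sales data: a fact table of purchase events whose rows point into dimension tables, plus a typical BI-style rollup over them.

# Dimension tables: descriptive attributes of the entities in the fact table.
dim_product = {101: {"name": "Mountain Bike", "category": "Bikes"},
               102: {"name": "Helmet", "category": "Accessories"}}
dim_customer = {7: {"name": "Ada"}, 8: {"name": "Grace"}}

# Fact table: one row per purchase event, holding keys into the dimensions
# plus numeric measures (quantity and amount).
fact_sales = [
    {"product_id": 101, "customer_id": 7, "qty": 1, "amount": 900.0},
    {"product_id": 102, "customer_id": 7, "qty": 2, "amount": 60.0},
    {"product_id": 102, "customer_id": 8, "qty": 1, "amount": 30.0},
]

# Roll up the facts by a dimension attribute: total sales amount per category.
totals = {}
for row in fact_sales:
    category = dim_product[row["product_id"]]["category"]
    totals[category] = totals.get(category, 0.0) + row["amount"]
print(totals)   # {'Bikes': 900.0, 'Accessories': 90.0}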
6. Object-oriented data modeling
As object-oriented programming advanced in the 1990s and software vendors
developed object databases, object-oriented data modeling also emerged.
The object-oriented approach is similar to the ER method in how it represents
data, attributes and relationships, but it abstracts entities into objects. Different
objects that have the same attributes and behaviors can be grouped into
classes, and new classes can inherit the attributes and behaviors of existing
ones. But object databases remain a niche technology for particular
applications, which has limited the use of object-oriented modeling.
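A minimal sketch (Python, with invented Person and Employee classes) of the object-oriented idea that attributes and behavior are grouped into classes and that new classes inherit from existing ones:

# A class groups attributes and behavior; Employee inherits both from Person
# and adds its own attribute, mirroring object-oriented data modeling.
class Person:
    def __init__(self, first_name: str, last_name: str):
        self.first_name = first_name
        self.last_name = last_name

    def full_name(self) -> str:          # shared behavior
        return f"{self.first_name} {self.last_name}"

class Employee(Person):                   # inherits Person's attributes and behavior
    def __init__(self, first_name: str, last_name: str, years_employed: int):
        super().__init__(first_name, last_name)
        self.years_employed = years_employed

e = Employee("Ada", "Lovelace", 5)
print(e.full_name(), e.years_employed)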

(Figure: an example of a graph data model with nodes connected by edges. Source: Neo4j.)

7. Graph data modeling


The graph data model is a more modern offshoot of network and hierarchical
models. Typically paired with graph databases, it's often used to describe data
sets that contain complex relationships. For example, graph data modeling is
a popular approach in social networks, recommendation engines and fraud
detection applications. Property graph data models are a common type -- in
them, nodes that represent data entities and document their properties are
connected by relationships, also known as edges or links, that define how
different nodes are related to one another.
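A small sketch of a property graph in plain Python, with fabricated social/retail data: nodes carry properties, and named edges define how nodes relate, which is the shape used by recommendation and fraud-detection queries.

# Property-graph sketch: nodes hold properties; edges are (source, relationship, target).
nodes = {
    "u1": {"label": "User", "name": "Ada"},
    "u2": {"label": "User", "name": "Grace"},
    "p1": {"label": "Product", "name": "Helmet"},
}
edges = [
    ("u1", "FRIEND_OF", "u2"),
    ("u2", "PURCHASED", "p1"),
]

# Traverse the relationships: what have Ada's friends purchased?
friends = [t for s, rel, t in edges if s == "u1" and rel == "FRIEND_OF"]
bought = [nodes[t]["name"] for s, rel, t in edges
          if rel == "PURCHASED" and s in friends]
print(bought)   # ['Helmet']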
What is the data modeling process?
Ideally, conceptual, logical and physical data models are created in a
sequential process that involves members of the data management team and
business users. Input from business executives and workers is especially
important during the conceptual and logical modeling phases. Otherwise, the
data models may not fully capture the business context of data or meet an
organization's information needs.

Typically, a data modeler or data architect initiates a modeling project by interviewing business stakeholders to gather requirements and details about business processes. Business analysts may also help design both the conceptual and logical models. At the end of the project, the physical data model is used to communicate specific technical requirements to database designers.

Peter Aiken, a data management consultant and associate professor of information systems at Virginia Commonwealth University, listed the following six steps for designing a data model during a 2019 Dataversity webinar:

o Identify the business entities that are represented in the data set.
o Identify key properties for each entity to differentiate between them.
o Create a draft entity-relationship model to show how entities are connected.
o Identify the data attributes that need to be incorporated into the model.
o Map the attributes to entities to illustrate the data's business meaning.
o Finalize the data model and validate its accuracy.

Even after that, the process typically isn't finished: Data models often must be
updated and revised as an organization's data assets and business needs
change.
(Figure: the six steps to take when designing a data model.)

Benefits and challenges of data modeling


Well-designed data models help an organization develop and implement a
data strategy that takes full advantage of its data. Effective data modeling also
helps ensure that individual databases and applications include the right data
and are designed to meet business requirements on data processing and
management.

Other benefits that data modeling provides include the following:

o Internal agreement on data definitions and standards. Data modeling supports efforts to standardize data definitions, terminology, concepts and formats enterprise-wide.

o Increased involvement in data management by business users. Because data modeling requires business input, it encourages collaboration between data management teams and business stakeholders, which ideally results in better systems.

o More efficient database design at a lower cost. By giving database designers a detailed blueprint to work from, data modeling streamlines their work and reduces the risk of design missteps that require revisions later in the process.

o Better use of available data assets. Ultimately, good data modeling enables organizations to use their data more productively, which can lead to better business performance, new business opportunities and competitive advantages over rival companies.

However, data modeling is a complicated process that can be difficult to do successfully. These are some of the common challenges that can send data modeling projects off track:

o A lack of organizational commitment and business buy-in. If corporate and business executives aren't on board about the need for data modeling, it's hard to get the required level of business participation. That means data management teams must secure executive support upfront.

o A lack of understanding by business users. Even if business stakeholders are fully committed, data modeling is an abstract process that can be hard for people to grasp. To help avoid that, conceptual and logical data models should be based on business terminology and concepts.

o Modeling complexity and scope creep. Data models often are big and complex, and modeling projects can become unwieldy if teams continue to create new iterations without finalizing the designs. It's important to set priorities and stick to an achievable project scope.

o Undefined or unclear business requirements. Particularly with new applications, the business side may not have fully formed information needs. Data modelers often must ask a series of questions to gather or clarify requirements and identify the necessary data.

Software Engineering | Information System Life Cycle
In a large organisation, the database system is typically part of the information system, which includes all the resources that are involved in the collection, management, use and dissemination of the information resources of the organisation. In today's world, these resources include the data itself, the DBMS software, the computer system software and storage media, the people who use and manage the data, and the application programmers who develop these applications. Thus the database system is part of a much larger organizational information system.
In this article, we will discuss the typical life cycle of an information system and how the database fits into this life cycle. The information system life cycle is also known as the macro life cycle. This cycle typically includes the following phases:
1. Feasibility Analysis –
This phase is basically concerned with the following points:
(a) Analyzing potential application areas.
(b) Identifying the economics of information gathering.
(c) Performing preliminary cost-benefit studies.
(d) Determining the complexity of data and processes.
(e) Setting up priorities among applications.
2. Requirements Collection and Analysis –
In this phase, we basically do the following:
(a) Detailed requirements are collected by interacting with potential users and groups to identify their particular problems and needs.
(b) Inter-application dependencies are identified.
(c) Communication and reporting procedures are identified.
3. Design –
This phase has the following two aspects:
(a) Design of the database.
(b) Design of the application systems that use and process the database.
4. Implementation –
In this phase, the following steps are carried out:
(a) The information system is implemented.
(b) The database is loaded.
(c) The database transactions are implemented and tested.
5. Validation and Acceptance Testing –
The acceptability of the system in meeting users' requirements and performance criteria is validated. The system is tested against the performance criteria and behavior specifications.
6. Deployment, Operation and Maintenance –
This may be preceded by conversion of users from the older system as well as by user training. The operational phase starts when all system functions are operational and have been validated. As new requirements or applications crop up, they pass through all the previous phases until they are validated and incorporated into the system. Monitoring and system maintenance are important activities during the operational phase.

Types of Software Testing


Testing is the process of executing a program with the intent of finding errors. To make our software perform well, it should be error-free. If testing is done successfully, it will uncover most of the errors in the software.

Principles of Testing:-

(i) All the tests should meet the customer's requirements.
(ii) To make testing effective, it should be performed by a third party.
(iii) Exhaustive testing is not possible; we need an optimal amount of testing based on the risk assessment of the application.
(iv) All the tests to be conducted should be planned before they are implemented.
(v) It follows the Pareto rule (80/20 rule), which states that 80% of errors come from 20% of program components.
(vi) Start testing with small parts and extend it to large parts.
Types of Testing:-
1. Unit Testing

It focuses on the smallest unit of software design. In this, we test an individual unit or
group of interrelated units. It is often done by the programmer by using sample input
and observing its corresponding outputs.

Example:

a) In a program, checking that a loop, method, or function is working fine.
b) Misunderstood or incorrect arithmetic precedence.
c) Incorrect initialization.
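A minimal sketch of a unit test using Python's built-in unittest module; the add() function and its expected values are made up purely to illustrate testing one unit in isolation.

import unittest

def add(a, b):
    # the smallest unit under test: a single function
    return a + b

class TestAdd(unittest.TestCase):
    def test_add_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)      # sample input vs. expected output

    def test_add_negative_number(self):
        self.assertEqual(add(2, -3), -1)

if __name__ == "__main__":
    unittest.main()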
2. Integration Testing

The objective is to take unit-tested components and build a program structure that has
been dictated by design. Integration testing is testing in which a group of components is
combined to produce output.

Integration testing is of four types: (i) Top-down (ii) Bottom-up (iii) Sandwich (iv) Big-
Bang
Example:

(a) Black-box testing: It is used for validation. In this, we ignore the internal working mechanisms and focus on what the output is.
(b) White-box testing: It is used for verification. In this, we focus on the internal mechanisms, i.e., how the output is achieved.
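A rough sketch of integration testing in the same style: two hypothetical pieces (a tax calculator and an invoice builder) that could each be unit-tested alone are combined, and the test checks their interaction rather than either piece in isolation.

import unittest

def compute_tax(amount):
    # module 1: tax calculation (a flat 10% rate, assumed for illustration)
    return round(amount * 0.10, 2)

def build_invoice(amount):
    # module 2: consumes module 1's output to build the final invoice
    tax = compute_tax(amount)
    return {"subtotal": amount, "tax": tax, "total": amount + tax}

class TestInvoiceIntegration(unittest.TestCase):
    def test_invoice_combines_amount_and_tax(self):
        invoice = build_invoice(100.0)
        # the assertion targets the combined behavior, not a single unit
        self.assertEqual(invoice["total"], 110.0)

if __name__ == "__main__":
    unittest.main()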
3. Regression Testing

Every time a new module is added, it leads to changes in the program. This type of testing makes sure that the whole component works properly even after adding components to the complete program.
Example:

In a school record system, suppose we have modules for staff, students, and finance. Combining these modules and checking whether they still work correctly together after a change is regression testing.
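A small illustrative sketch of the regression idea: the existing, previously passing tests for the (hypothetical) staff and student modules are simply re-run after the finance module is changed, to confirm that nothing that used to work has broken.

import unittest

# Existing tests for the staff and student modules, passing before the change.
class TestStaffModule(unittest.TestCase):
    def test_staff_record_has_name(self):
        staff = {"id": 1, "name": "Ada"}
        self.assertIn("name", staff)

class TestStudentModule(unittest.TestCase):
    def test_student_marks_total(self):
        self.assertEqual(sum([40, 35]), 75)

# After modifying the finance module, the whole suite is executed again:
# re-running unchanged tests after a change is the essence of regression testing.
if __name__ == "__main__":
    unittest.main()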
4. Smoke Testing

This test is done to make sure that the software under testing is ready or stable for
further testing
It is called a smoke test as the testing of an initial pass is done to check if it did not
catch the fire or smoke in the initial switch on.
Example:

If the project has 2 modules so before going to the module


make sure that module 1 works properly
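A minimal sketch of a smoke test: a quick check that a hypothetical application starts and reports itself healthy before any deeper testing is attempted (start_application() is an invented stand-in, not a real API).

def start_application():
    # stand-in for booting the real system; returns a handle if startup worked
    return {"status": "up"}

def smoke_test():
    # Smoke test: only verify the build is stable enough for further testing.
    app = start_application()
    assert app["status"] == "up", "build is not stable -- stop further testing"
    print("Smoke test passed: proceed with detailed testing")

if __name__ == "__main__":
    smoke_test()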
5. Alpha Testing

This is a type of validation testing. It is a type of acceptance testing which is done before the product is released to customers. It is typically done by QA people.
Example:

When software testing is performed internally within the organization.
6. Beta Testing

The beta test is conducted at one or more customer sites by the end-users of the software. This version is released to a limited number of users for testing in a real-time environment.
Example:

When software testing is performed by a limited number of people.
7. System Testing

The software is tested such that it works fine on different operating systems. It is covered under the black-box testing technique. In this, we just focus on the required input and output without focusing on the internal working.
In this, we have security testing, recovery testing, stress testing, and performance testing.
Example:

This includes functional as well as non-functional testing.
8. Stress Testing

In this, we give unfavorable conditions to the system and check how it performs under those conditions.
Example:
(a) Test cases that require maximum memory or other resources are executed.
(b) Test cases that may cause thrashing in a virtual operating system.
(c) Test cases that may cause excessive disk requirements.
9. Performance Testing

It is designed to test the run-time performance of software within the context of an integrated system. It is used to test the speed and effectiveness of the program. It is also called load testing. In it, we check what the performance of the system is under a given load.
Example:

Checking the number of processor cycles used.
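A tiny sketch of measuring run-time performance with Python's standard timeit module; the workload() function is a made-up stand-in for real application work.

import timeit

def workload():
    # stand-in for an operation whose speed we want to measure
    return sum(i * i for i in range(10_000))

# Execute the operation many times and report the average time per call.
runs = 200
total = timeit.timeit(workload, number=runs)
print(f"average time per call: {total / runs:.6f} seconds")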


10. Object-Oriented Testing

This testing is a combination of various testing techniques that help to verify and
validate object-oriented software. This testing is done in the following manner:

o Testing of requirements,
o Design and analysis of testing,
o Testing of code,
o Integration testing,
o System testing,
o User testing.

11. Acceptance Testing

Acceptance testing is done by the customers to check whether the delivered products
perform the desired tasks or not, as stated in requirements.

We use this object-oriented testing approach for discussing test plans and for executing the projects.

Strategic planning
What is strategic planning?
Strategic planning is a process in which an organization's leaders define their
vision for the future and identify their organization's goals and objectives. The
process includes establishing the sequence in which those goals should be
realized so that the organization can reach its stated vision.

Strategic planning typically represents mid- to long-term goals with a life span
of three to five years, though it can go longer. This is different than business
planning, which typically focuses on short-term, tactical goals, such as how a
budget is divided up. The time covered by a business plan can range from
several months to several years.

The product of strategic planning is a strategic plan. It is often reflected in a plan document or other media. These plans can be easily shared, understood and followed by various people including employees, customers, business partners and investors.

Organizations conduct strategic planning periodically to consider the effect of changing business, industry, legal and regulatory conditions. A strategic plan may be updated and revised at that time to reflect any strategic changes.
Why is strategic planning important?
Businesses need direction and organizational goals to work toward. Strategic
planning offers that type of guidance. Essentially, a strategic plan is a
roadmap to get to business goals. Without such guidance, there is no way to
tell whether a business is on track to reach its goals.

The following four aspects of strategy development are worth attention:

1. The mission. Strategic planning starts with a mission that offers a company a sense of purpose and direction. The organization's mission statement describes who it is, what it does and where it wants to go. Missions are typically broad but actionable. For example, a business in the education industry might seek to be a leader in online virtual educational tools and services.

2. The goals. Strategic planning involves selecting goals. Most planning uses SMART goals -- specific, measurable, achievable, realistic and time-bound -- or other objectively measurable goals. Measurable goals are important because they enable business leaders to determine how well the business is performing against goals and the overall mission. Goal setting for the fictitious educational business might include releasing the first version of a virtual classroom platform within two years or increasing sales of an existing tool by 30% in the next year.

3. Alignment with short-term goals. Strategic planning relates directly to short-term, tactical business planning and can help business leaders with everyday decision-making that better aligns with business strategy. For the fictitious educational business, leaders might choose to make strategic investments in communication and collaboration technologies, such as virtual classroom software and services, but decline opportunities to establish physical classroom facilities.

4. Evaluation and revision. Strategic planning helps business leaders periodically evaluate progress against the plan and make changes or adjustments in response to changing conditions. For example, a business may seek a global presence, but legal and regulatory restrictions could emerge that affect its ability to operate in certain geographic regions. As a result, business leaders might have to revise the strategic plan to redefine objectives or change progress metrics.
What are the steps in the strategic planning process?
There are myriad different ways to approach strategic planning depending on
the type of business and the granularity required. Most strategic planning
cycles can be summarized in these five steps:
Identify. A strategic planning cycle starts with the determination of a
business's current strategic position. This is where stakeholders use the
existing strategic plan -- including the mission statement and long-term
strategic goals -- to perform assessments of the business and its environment.
These assessments can include a needs assessment or a SWOT (strengths,
weaknesses, opportunities and threats) analysis to understand the state of the
business and the path ahead.

Prioritize. Next, strategic planners set objectives and initiatives that line up
with the company mission and goals and will move the business toward
achieving its goals. There may be many potential goals, so planning prioritizes
the most important, relevant and urgent ones. Goals may include a
consideration of resource requirements -- such as budgets and equipment --
and they often involve a timeline and business metrics or KPIs for measuring
progress.

Develop. This is the main thrust of strategic planning in which stakeholders collaborate to formulate the steps or tactics necessary to attain a stated strategic objective. This may involve creating numerous short-term tactical business plans that fit into the overarching strategy. Stakeholders involved in plan development use various tools such as a strategy map to help visualize and tweak the plan. Developing the plan may involve cost and opportunity tradeoffs that reflect business priorities. Developers may reject some initiatives if they don't support the long-term strategy.

Implement. Once the strategic plan is developed, it's time to put it in motion.
This requires clear communication across the organization to set
responsibilities, make investments, adjust policies and processes, and
establish measurement and reporting. Implementation typically
includes strategic management with regular strategic reviews to ensure that
plans stay on track.
Update. A strategic plan is periodically reviewed and revised to adjust
priorities and reevaluate goals as business conditions change and new
opportunities emerge. Quick reviews of metrics can happen quarterly, and
adjustments to the strategic plan can occur annually. Stakeholders may
use balanced scorecards and other tools to assess performance against
goals.

Who does the strategic planning in a business?


A committee typically leads the strategic planning process. Planning experts
recommend the committee include representatives from all areas within the
enterprise and work in an open and transparent way where information is
documented from start to finish.

The committee researches and gathers the information needed to understand the organization's current status and factors that will affect it in the future. The committee should solicit input and feedback to validate or challenge its assessment of the information.
The committee can opt to use one of many methodologies or
strategic frameworks that have been developed to guide leaders through this
process. These methodologies take the committee through a series of steps
that include an analysis or assessment, strategy formulation, and the
articulation and communication of the actions needed to move the
organization toward its strategic vision.

The committee creates benchmarks that will enable the organization to determine how well it is performing against its goals as it implements the strategic plan. The planning process should also identify which executives are accountable for ensuring that benchmarking activities take place at planned times and that specific objectives are met.

How often should strategic planning be done?


There are no uniform requirements to dictate the frequency of a strategic
planning cycle. However, there are common approaches.

o Quarterly reviews. Once a quarter is usually a convenient time frame to revisit assumptions made in the planning process and gauge progress by checking metrics against the plan.

o Annual reviews. A yearly review lets business leaders assess metrics for the previous four quarters and make informed adjustments to the plan.

Timetables are always subject to change. Timing should be flexible and tailored to the needs of a company. For example, a startup in a dynamic industry might revisit its strategic plan monthly. A mature business in a well-established industry might opt to revisit the plan less frequently.

Types of strategic plans


Strategic planning activities typically focus on three areas: business, corporate
or functional. They break out as follows:
o Business. A business-centric strategic plan focuses on the competitive aspects of the organization -- creating competitive advantages and opportunities for growth. These plans adopt a mission evaluating the external business environment, setting goals, and allocating financial, human and technological resources to meet those goals. This is the typical strategic plan and the main focus of this article.

o Corporate. A corporate-centric plan defines how the company works. It focuses on organizing and aligning the structure of the business, its policies and processes and its senior leadership to meet desired goals. For example, the management of a research and development skunkworks might be structured to function dynamically and on an ad hoc basis. It would look different from the management team in finance or HR.

o Functional. Function-centric strategic plans fit within corporate-level strategies and provide a granular examination of specific departments or segments such as marketing, HR, finance and development. Functional plans focus on policy and process -- such as security and compliance -- while setting budgets and resource allocations.

In most cases, a strategic plan will involve elements of all three focus areas,
but the plan may lean toward one focus area depending on the needs and
type of business.

What is strategic management?


Organizations that are best at aligning their actions with their strategic plans
engage in strategic management. A strategic management process
establishes ongoing practices to ensure that an organization's processes and
resources support the strategic plan's mission and vision statement.

In simple terms, strategic management is the implementation of the strategy.


As such, strategic management is sometimes referred to as strategy
execution. Strategy execution involves identifying benchmarks, allocating
financial and human resources and providing leadership to realize established
goals.

Strategic management may involve a prescriptive or descriptive approach. A


prescriptive approach focuses on how strategies should be created. It often
uses an analytical approach -- such as SWOT or balanced scorecards -- to
account for risks and opportunities. A descriptive approach focuses on how
strategies should be implemented and typically relies on general guidelines or
principles.

Given the similarities between strategic planning and strategic management,


the two terms are sometimes used interchangeably.

What is a strategy map?


A strategy map is a planning tool or template used to help stakeholders
visualize the complete strategy of a business as one interrelated graphic.
These visualizations offer a powerful way for understanding and reviewing the
cause-and-effect relationships among the elements of a business strategy.

While a map can be drawn in a number of ways, all strategy maps focus on
four major business areas or categories: financial, customer, internal business
processes (IBPs), and learning and growth. Goals are sorted into those four areas,
and relationships or dependencies among those goals can be established.

For example, a strategy map might include a financial goal of reducing costs
and an IBP goal to improve operational efficiency. These two goals are related
and can help stakeholders understand that tasks such as improving
operational workflows can reduce company costs and meet two elements of
the strategic plan.
A strategy map can help translate overarching goals into an action plan and
goals that can be aligned and implemented.

Strategy mapping can also help to identify strategic challenges that might not
be obvious. For example, one learning and growth goal may be to increase
employee expertise, but that may expose unexpected challenges in employee
retention and compensation, which in turn affect cost reduction goals.
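
To make the cause-and-effect idea concrete, the sketch below models a strategy map as goals grouped into the four perspectives, with "supports" links between them. It is a minimal Python illustration; the goal names and the class and method names are invented for this example and are not part of any standard tool.

    # Minimal strategy-map sketch: goals in four perspectives linked by
    # cause-and-effect ("supports") relationships. Names are illustrative only.
    from collections import defaultdict

    PERSPECTIVES = ("financial", "customer", "internal_business_process", "learning_growth")

    class StrategyMap:
        def __init__(self):
            self.goals = {}                      # goal name -> perspective
            self.supports = defaultdict(list)    # goal -> goals it supports

        def add_goal(self, name, perspective):
            assert perspective in PERSPECTIVES
            self.goals[name] = perspective

        def link(self, lower_goal, higher_goal):
            # lower_goal contributes to (supports) higher_goal
            self.supports[lower_goal].append(higher_goal)

    m = StrategyMap()
    m.add_goal("Reduce costs", "financial")
    m.add_goal("Improve operational efficiency", "internal_business_process")
    m.add_goal("Increase employee expertise", "learning_growth")
    m.link("Increase employee expertise", "Improve operational efficiency")
    m.link("Improve operational efficiency", "Reduce costs")

    for goal, targets in m.supports.items():
        print(f"{goal} -> supports -> {', '.join(targets)}")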

Benefits of strategic planning


Effective strategic planning has many benefits. It forces organizations to be
aware of the future state of opportunities and challenges. It also forces them
to anticipate risks and understand what resources will be needed to seize
opportunities and overcome strategic issues.

Strategic planning also gives individuals a sense of direction and marshals


them around a common mission. It creates standards and accountability.
Strategic planning can enhance operational plans and efficiency. It also helps
organizations limit time spent on crisis management, where they're reacting to
unexpected changes that they failed to anticipate and prepare for.

SWOT Analysis

SWOT stands for strengths, weaknesses, opportunities and threats. SWOT


analysis is a widely used framework to summarize a company’s situation or
current position.
Any company undertaking strategic planning will have to carry out SWOT
analysis: establishing its current position in the light of its strengths,
weaknesses, opportunities and threats.
Environmental and industry analyses provide the information needed to
identify opportunities and threats, while internal analysis provides information
needed to identify strengths and weaknesses. These are the fundamental
areas of focus in SWOT analysis.
SWOT analysis stands at the core of strategic management. It is important to
note that strengths and weaknesses are intrinsic (potential) value-creating
skills or assets or the lack thereof, relative to competitive forces.
Opportunities and threats, however, are external factors that are not created
by the company but emerge as a result of the competitive dynamics caused
by ‘gaps’ or ‘crunches’ in the market.
We briefly mentioned the meaning of the terms opportunities, threats,
strengths and weaknesses earlier. We revisit them here for the purposes of
SWOT analysis.

Strengths
Strength is something a company possesses or is good at doing.
Examples include a skill, valuable assets, alliances or cooperative ventures,
experienced sales force, easy access to raw materials, brand reputation etc.
A growing market or new products are opportunities, not strengths.
Weaknesses
A weakness is something a company lacks or does poorly.
Examples include lack of skills or expertise, deficiencies in assets, inferior
capabilities in functional areas etc. Though weaknesses are often seen as the
logical ‘inverse’ of the company’s strengths, the company’s lack of strength in a
particular area or market is not necessarily a relative weakness, because
competitors may also lack this particular strength.

Opportunities
An opportunity is a major favorable situation in a firm’s environment.
Examples include market growth, favorable changes in the competitive or
regulatory framework, technological developments or demographic changes,
increase in demand, opportunity to introduce products in new markets, turning
R&D into cash by licensing or selling patents etc.
The level of detail and perceived degree of realism determine the extent of
opportunity analysis.

Threats
A threat is a major unfavorable situation in a firm’s environment.
Examples include an increase in competition, slow market growth, increased
power of buyers or suppliers, changes in regulations etc.
These forces pose serious threats to a company because they may cause
lower sales, higher cost of operations, higher cost of capital, inability to
break even, shrinking margins or profitability etc. Your competitor’s
opportunity may well be a threat to you.
Carrying out SWOT Analysis

The first thing that a SWOT analysis does is to evaluate the strengths and
weaknesses in terms of skills, resources and competencies. The analyst then
should see whether the internal capabilities match with the demands of the
key success factors.
The job of a strategist is to capitalize on the organization’s strengths while
minimizing the effects of its weaknesses in order to take advantage of
opportunities and overcome threats in the environment. SWOT analysis for a
typical firm is given below

Steps in SWOT Analysis


The three important steps in SWOT analysis are:
Identification
Conclusion
Translation

Identification
Identify company resource strengths and competitive capabilities
Identify company resource weaknesses and competitive deficiencies
Identify the company’s opportunities
Identify external threats

Conclusion
Draw conclusions about the company’s overall situation.

Translation
Translate the conclusions into strategic actions by acting on them:
Match the company’s strategy to its strengths and opportunities
Correct important weaknesses
Defend against external threats
In devising a SWOT analysis, there are several factors that will enhance the
quality of the material:
Keep it brief, pages of analysis are usually not required.
Relate strengths and weaknesses, wherever possible, to industry key factors
for success.
Strengths and weaknesses should also be stated in competitive terms, that is,
in comparison with competitors.
Statements should be specific and avoid blandness.
Analysis should reflect the gap, that is, where the company wishes to be and
where it is now.
It is important to be realistic about the strengths and weaknesses of one’s own
and competitive organizations.
Probably the biggest mistake that is commonly made in SWOT analysis is to
provide a long list of points but little logic, argument and evidence. A short list
with each point well-argued is more likely to be convincing.

Four sets of emerging strategies


SO Strategies
SO strategies are generated by thinking of ways in which a company can use
its strengths to take advantage of opportunities. This is the most desirable and
advantageous strategy, as it seeks to marshal the firm’s strengths to exploit
opportunities.
For example, Hindustan Lever has been augmenting its strengths by taking
over businesses in the food industry, to exploit the growing potential of the
food business.

ST Strategies
ST strategies use a company’s strengths as a way to avoid threats. A
company may use its technological, financial and marketing strengths to
combat a new competition.
For example, Hindustan Lever has been employing this strategy to fight the
increasing competition from companies like Nirma, Procter & Gamble etc.

WO Strategies
WO strategies attempt to take advantage of opportunities by overcoming the
company’s weaknesses.
For example, for textile machinery manufacturers in India the main weakness
was dependence on foreign firms for technology and the long time taken to
execute an order. The strategy followed was the thrust given to R&D to
develop indigenous technology so as to be in a better position to exploit the
opportunity of growing demand for textile machinery.
WT Strategies
WT Strategies are basically defensive strategies and primarily aimed at
minimizing weaknesses and avoiding threats.
For example, managerial weakness may be solved by a change of
managerial personnel, training and development etc. Weakness due to
excess manpower may be addressed by restructuring, downsizing, delayering
and voluntary retirement schemes.
External threats may be met by joint ventures and other types of strategic
alliances. In some cases, an unprofitable business that cannot be revived may
be divested. Strategies which utilize a strength to take advantage of an
opportunity are generally referred to as “exploitative” or “developmental
strategies”.
Strategies which use a strength to eliminate a weakness may be referred to
as “blocking strategies”. Strategies which overcome a weakness to take
advantage of an opportunity or eliminate a threat may be referred to as
“remedial strategies”.

Critical Assessment of SWOT Analysis


SWOT analysis is one of the most basic techniques for analyzing firm and
industry conditions. It provides the “raw material” for analyzing internal
conditions as well as external conditions of a firm. SWOT analysis can be
used in many ways to aid strategic analysis.
For example, it can be used for a systematic discussion of a firm’s resources
and basic alternatives that emerge from such an analysis. Such a discussion
is necessary because a strength to one firm may be a weakness for another
firm, and vice-versa.
For example, increased health consciousness of people is a threat to some
firms (e.g. tobacco) while it is an opportunity to others (e.g. health clubs).
According to Johnson and Scholes (2002), a SWOT analysis summarizes the
key issues from the business environment and the strategic capability of an
organization that impact strategy development.
This can also be useful as a basis for judging future courses of action. The
aim is to identify the extent to which the current strengths and weaknesses
are relevant to, and capable of, dealing with the changes taking place in the
business environment.
It can also be used to assess whether there are opportunities to exploit further
the unique resources or core competencies of the organization. Overall,
SWOT analysis helps focus discussion on future choices and the extent to
which the company is capable of supporting its strategies.

Advantages of SWOT Analysis


 It is simple.

 It portrays the essence of strategy formulation: matching a firm’s internal


strengths and weaknesses with its external opportunities and threats.

 Together with other techniques like Value Chain Analysis and RBV,
SWOT analysis improves the quality of the internal analysis.

Limitations of SWOT analysis


 It gives a static perspective and does not reveal the dynamics of
competitive environment.

 SWOT emphasizes a single dimension of strategy (i.e. strength or


weakness) and ignores other factors needed for competitive success.

 A firm’s strengths do not necessarily help the firm create value or


competitive advantage.

 SWOT’s focus on the external environment is too narrow.


In spite of the above criticism and its limitations, SWOT analysis is still a
popular analytical tool used by most organizations. It is definitely a useful aid
in generating alternative strategies, through what is called a TOWS matrix.
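
As a rough illustration of how a TOWS matrix cross-pairs the four SWOT lists to prompt candidate strategies, the Python sketch below enumerates SO, ST, WO and WT combinations from small example lists. The list entries are invented for illustration and the pairing rule is deliberately simplistic.

    # Illustrative TOWS matrix: cross-pair the SWOT lists to prompt strategies.
    from itertools import product

    swot = {
        "strengths":     ["strong brand", "wide distribution network"],
        "weaknesses":    ["dependence on imported technology"],
        "opportunities": ["growing demand in the food business"],
        "threats":       ["new low-cost competitors"],
    }

    pairings = {
        "SO": ("strengths", "opportunities"),   # exploitative / developmental
        "ST": ("strengths", "threats"),
        "WO": ("weaknesses", "opportunities"),  # remedial
        "WT": ("weaknesses", "threats"),        # defensive
    }

    for label, (internal, external) in pairings.items():
        for i, e in product(swot[internal], swot[external]):
            print(f"{label}: consider '{i}' against '{e}'")

Each printed pairing is only a prompt for discussion; the analyst still has to judge which combinations translate into workable strategies.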

Software Engineering | Architectural design


Introduction: Architectural design represents the overall structure of the
software. IEEE defines architectural design as “the process of defining a
collection of hardware and software components and their interfaces to
establish the framework for the development of a computer system.” The
software that is built for computer-based systems can exhibit one of many
architectural styles.
Each style will describe a system category that consists of:

 A set of components (e.g., a database, computational modules)
that will perform a function required by the system.
 A set of connectors that help in coordination, communication,
and cooperation among the components.
 Constraints that define how components can be integrated to form the
system.
 Semantic models that help the designer to understand the
overall properties of the system.
The use of architectural styles is to establish a structure for all the
components of the system.
Taxonomy of Architectural styles:
1. Data centered architectures:
 A data store will reside at the center of this architecture and
is accessed frequently by the other components that update,
add, delete or modify the data present within the store.
 The figure illustrates a typical data-centered style. Client
software accesses a central repository. Variations of this
approach transform the repository into a ‘blackboard’ that
sends notifications to client software when data of interest to
the client changes.
 This data-centered architecture promotes integrability.
This means that existing components can be changed
and new client components can be added to the architecture
without concern for the other clients.
 Data can be passed among clients using blackboard
mechanism.
Advantage of Data centered architecture
 Repository of data is independent of clients
 Clients work independently of each other
 It may be simple to add additional clients.
 Modification can be very easy

Data centered architecture
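
A minimal sketch of the repository/blackboard idea, assuming a simple in-memory store and callback-style notification; the class and method names are illustrative and not a standard API.

    # Data-centred (blackboard) sketch: clients share one repository and are
    # notified when data of interest changes. Names are illustrative only.
    class Blackboard:
        def __init__(self):
            self._data = {}
            self._subscribers = []            # client notification callbacks

        def subscribe(self, callback):
            self._subscribers.append(callback)

        def put(self, key, value):
            self._data[key] = value
            for notify in self._subscribers:  # notify every interested client
                notify(key, value)

        def get(self, key):
            return self._data.get(key)

    board = Blackboard()
    board.subscribe(lambda k, v: print(f"client A sees update: {k}={v}"))
    board.subscribe(lambda k, v: print(f"client B sees update: {k}={v}"))
    board.put("order_status", "shipped")

New clients can be added simply by subscribing, which is the integrability property described above.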

2. Data flow architectures:


 This kind of architecture is used when input data is to be
transformed into output data through a series of
computational or manipulative components.
 The figure represents pipe-and-filter architecture since it
uses both pipe and filter and it has a set of components
called filters connected by pipes.
 Pipes are used to transmit data from one component to the
next.
 Each filter works independently and is designed to take
data input of a certain form and produce data output of a
specified form for the next filter. The filters don’t require any
knowledge of the working of neighboring filters.
 If the data flow degenerates into a single line of transforms,
then it is termed as batch sequential. This structure accepts
the batch of data and then applies a series of sequential
components to transform it.
Advantage of Data Flow architecture
 It encourages upkeep, repurposing, and modification.
 With this design, concurrent execution is supported.
Disadvantage of Data Flow architecture
 It frequently degenerates into a batch sequential system
 Data flow architecture is not well suited to applications that require
greater user engagement.
 It is not easy to coordinate two different but related streams

Data Flow architecture
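
The pipe-and-filter idea can be sketched with Python generator functions acting as filters chained together; each filter knows only its own input and output form, not its neighbours. The filter names are invented for illustration.

    # Pipe-and-filter sketch: each filter transforms a stream independently.
    def read_source(lines):
        for line in lines:
            yield line

    def strip_blanks(stream):
        for line in stream:
            if line.strip():
                yield line.strip()

    def to_upper(stream):
        for line in stream:
            yield line.upper()

    raw = ["  hello ", "", "pipe and filter "]
    pipeline = to_upper(strip_blanks(read_source(raw)))   # filters joined by "pipes"
    print(list(pipeline))                                 # ['HELLO', 'PIPE AND FILTER']

If the chain collapses to a single line of transformations applied to a whole batch of data, the structure degenerates into the batch sequential form mentioned above.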

3. Call and Return architectures: This style is used to create a program
that is easy to scale and modify. Many sub-styles exist within
this category. Two of them are explained below.
 Remote procedure call architecture: The components of a
main program or subprogram architecture are distributed
across multiple computers on a network.
 Main program or subprogram architectures: The main
program decomposes function into a control hierarchy; it
contains a number of subprograms, which in turn can invoke
other components.
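
A minimal main-program/subprogram sketch: the main program sits at the top of a control hierarchy and invokes subprograms, which in turn invoke further components. The function names are invented for illustration.

    # Main program / subprogram sketch: a simple control hierarchy.
    def validate(order):                 # subprogram invoked by process_order
        return bool(order.get("item")) and order.get("qty", 0) > 0

    def price(order):                    # another subprogram
        return order["qty"] * 9.99

    def process_order(order):            # intermediate component
        if not validate(order):
            raise ValueError("invalid order")
        return price(order)

    def main():                          # main program controls the hierarchy
        total = process_order({"item": "widget", "qty": 3})
        print(f"total = {total:.2f}")

    main()

In the remote procedure call variant, calls such as process_order would cross a network boundary instead of staying within one process.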
4. Object Oriented architecture: The components of a system
encapsulate data and the operations that must be applied to
manipulate the data. Coordination and communication
between the components are established via message
passing.
Characteristics of Object Oriented architecture
 Objects protect the system’s integrity.
 An object is unaware of the representation of other objects.
Advantage of Object Oriented architecture
 It enables the designer to separate a challenge into a collection
of autonomous objects.
 Objects are not aware of the implementation details of other
objects, allowing changes to be made to one object without
impacting the others.
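
A small sketch of the object-oriented style: each object encapsulates its data and the operations on that data, and objects coordinate by sending messages (method calls) to one another. The classes are illustrative only.

    # Object-oriented sketch: encapsulation plus message passing.
    class Account:
        def __init__(self, balance=0.0):
            self._balance = balance        # representation hidden from other objects

        def deposit(self, amount):
            self._balance += amount

        def balance(self):
            return self._balance

    class Cashier:
        def handle_deposit(self, account, amount):
            account.deposit(amount)        # message sent to the Account object

    acct = Account()
    Cashier().handle_deposit(acct, 100.0)
    print(acct.balance())                  # 100.0

The Cashier never touches _balance directly, so the Account's internal representation can change without affecting it.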
5. Layered architecture:

 A number of different layers are defined, with each layer
performing a well-defined set of operations. Moving inward, the
operations performed by each layer become progressively
closer to the machine instruction set.
 At the outer layer, components receive the user interface
operations, and at the inner layers, components perform the
operating system interfacing (communication and coordination
with the OS).
 Intermediate layers provide utility services and application software
functions.
 One common example of this architectural style is OSI-ISO
(Open Systems Interconnection-International Organisation for
Standardisation) communication system.

Layered architecture:
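
A minimal layered sketch, assuming three layers: an outer user-interface layer, an intermediate application-service layer, and an inner data-access layer closest to the operating system and storage. Each layer talks only to the layer directly beneath it; all names are illustrative.

    # Layered architecture sketch: each layer uses only the layer below it.
    class DataAccessLayer:               # innermost: closest to OS / storage
        def load_user(self, user_id):
            return {"id": user_id, "name": "demo user"}

    class ServiceLayer:                  # intermediate: utility / application logic
        def __init__(self):
            self._dal = DataAccessLayer()

        def greeting_for(self, user_id):
            user = self._dal.load_user(user_id)
            return f"Hello, {user['name']}!"

    class UserInterfaceLayer:            # outermost: user interface operations
        def __init__(self):
            self._service = ServiceLayer()

        def show_greeting(self, user_id):
            print(self._service.greeting_for(user_id))

    UserInterfaceLayer().show_greeting(42)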

Software Engineering | User Interface Design
The user interface is the front-end application view with which the user
interacts in order to use the software. The software becomes
more popular if its user interface is:
 Attractive
 Simple to use
 Responsive in short time
 Clear to understand
 Consistent on all interface screens
There are two types of User Interface:
1. Command Line Interface: A Command Line Interface provides a
command prompt, where the user types a command and
feeds it to the system. The user needs to remember the syntax of
the command and its use.
2. Graphical User Interface: A Graphical User Interface provides
a simple interactive interface to interact with the system. A GUI
can be a combination of both hardware and software. Using a
GUI, the user interacts with the software.
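
As a tiny illustration of the command-line style, the sketch below uses Python's standard argparse module to define a command whose syntax and options the user has to remember; the command name and options are invented for this example.

    # Command-line interface sketch using argparse (standard library).
    import argparse

    parser = argparse.ArgumentParser(prog="report", description="Generate a report")
    parser.add_argument("--month", required=True, help="month to report on, e.g. 2024-05")
    parser.add_argument("--format", choices=["pdf", "csv"], default="csv")

    # Passing an explicit argument list here stands in for what a user would type.
    args = parser.parse_args(["--month", "2024-05", "--format", "pdf"])
    print(f"Generating {args.format} report for {args.month}")

A GUI would expose the same choices through menus and dialogs instead of remembered syntax.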
User Interface Design Process:

The analysis and design process of a user interface is iterative


and can be represented by a spiral model. The analysis and
design process of user interface consists of four framework
activities.
1. User, task, environmental analysis, and modeling: Initially,
the focus is based on the profile of users who will interact with
the system, i.e. their understanding, skill and knowledge, type of
user, etc. Based on the user’s profile, users are grouped into
categories, and requirements are gathered from each category.
Based on the requirements, the developer understands how to
develop the interface. Once all the requirements are gathered, a
detailed analysis is conducted. In the analysis part, the tasks
that the user performs to establish the goals of the system are
identified, described and elaborated. The analysis of the user
environment focuses on the physical work environment. Among
the questions to be asked are:
 Where will the interface be located physically?
 Will the user be sitting, standing, or performing other tasks
unrelated to the interface?
 Does the interface hardware accommodate space, light, or
noise constraints?
 Are there special human factors considerations driven by
environmental factors?
2. Interface Design: The goal of this phase is to define the set of
interface objects and actions i.e. Control mechanisms that
enable the user to perform desired tasks. Indicate how these
control mechanisms affect the system. Specify the action
sequence of tasks and subtasks, also called a user scenario.
Indicate the state of the system when the user performs a
particular task. Always follow the three golden rules stated by
Theo Mandel. Design issues such as response time, command
and action structure, error handling, and help facilities are
considered as the design model is refined. This phase serves
as the foundation for the implementation phase.
3. Interface construction and implementation: The
implementation activity begins with the creation of a prototype
(model) that enables usage scenarios to be evaluated. As the
iterative design process continues, a user interface toolkit that
allows the creation of windows, menus, device interaction, error
messages, commands, and many other elements of an
interactive environment can be used for completing the
construction of an interface.
4. Interface Validation: This phase focuses on testing the
interface. The interface should be able to perform tasks
correctly and handle a variety of tasks. It should achieve all the user’s
requirements. It should be easy to use and easy to learn. Users
should accept the interface as a useful one in their work.
Golden Rules:
The following are the golden rules stated by Theo Mandel that
must be followed during the design of the interface. Place the
user in control:

 Define the interaction modes in such a way that does not force
the user into unnecessary or undesired actions: The user
should be able to easily enter and exit the mode with little or no
effort.
 Provide for flexible interaction: Different people will use
different interaction mechanisms, some might use keyboard
commands, some might use mouse, some might use touch
screen, etc, Hence all interaction mechanisms should be
provided.
 Allow user interaction to be interruptible and undoable: When a
user is doing a sequence of actions the user must be able to
interrupt the sequence to do some other work without losing
the work that had been done. The user should also be able to
do undo operation.
 Streamline interaction as skill level advances and allow the
interaction to be customized: Advanced or highly skilled users
should be given the chance to customize the interface and
choose among different interaction mechanisms, so that they
don’t get bored using the same mechanism.
 Hide technical internals from casual users: The user should not
be aware of the internal technical details of the system. He
should interact with the interface just to do his work.
 Design for direct interaction with objects that appear on screen:
The user should be able to use the objects and manipulate the
objects that are present on the screen to perform a necessary
task. This gives the user a sense of direct control over the screen.
Reduce the user’s memory load:
 Reduce demand on short-term memory: When users are
involved in some complex tasks the demand on short-term
memory is significant. So the interface should be designed in
such a way to reduce the remembering of previously done
actions, given inputs and results.
 Establish meaningful defaults: An initial set of defaults should
always be provided for the average user; if a user needs some
new features, they should be able to add them.
 Define shortcuts that are intuitive: Mnemonics (keyboard
shortcuts used to perform an action on the screen) should be
intuitive and easy for the user to remember.
 The visual layout of the interface should be based on a real-
world metaphor: Anything represented on the screen that is a
metaphor for a real-world entity will be easier for users to
understand.
 Disclose information in a progressive fashion: The interface
should be organized hierarchically i.e. on the main screen the
information about the task, an object or some behavior should
be presented first at a high level of abstraction. More detail
should be presented after the user indicates interest with a
mouse pick.
Make the interface consistent:
 Allow the user to put the current task into a meaningful context:
Many interfaces have dozens of screens. So it is important to
provide indicators consistently so that the user knows the
context of the work being done. The user should also know from
which page they navigated to the current page, and where they
can navigate from the current page.
 Maintain consistency across a family of applications: The
development of a set of related applications should follow and
implement the same design rules so that consistency is
maintained among the applications.
 If past interactive models have created user expectations, do
not make changes unless there is a compelling reason.

Generic process model

The generic process model is an abstraction of the software development process. It is used in
most software projects since it provides a base for them.

The generic process model encompasses the following five steps:

1. Communication

2. Planning

3. Modelling

4. Construction

5. Deployment

Communication

In this step, we communicate with the clients and end-users.

 We discuss the requirements of the project with the users.

 The users give suggestions on the project. If any changes are difficult to implement, we work

on alternative ideas.

Planning

In this step, we plan the steps for project development. After completing the final discussion, we

report on the project.


 Planning plays a key role in the software development process.

 We discuss the risks involved in the project.

Modelling

In this step, we create a model to understand the project in the real world. We showcase the model

to all the developers. If changes are required, we implement them in this step.

 We develop a practical model to get a better understanding of the project.

Construction

In this step, we follow a procedure to develop the final product.

 If any code is required for the project development, we implement it in this phase.

 We also test the project in this phase.

Deployment

In this phase, we submit the project to the clients for their feedback and add any missing

requirements.

 We get the client feedback.

 Depending on the feedback, we make the appropriate changes.

Software Maintenance
Software maintenance is a widely accepted part of the SDLC nowadays. It stands for all the

modifications and updates done after the delivery of a software product. There are a number of

reasons why modifications are required; some of them are briefly mentioned below:

 Market Conditions - Policies that change over time, such as taxation, and newly

introduced constraints, such as bookkeeping requirements, may trigger the need for modification.

 Client Requirements - Over time, customers may ask for new features or functions in the

software.

 Host Modifications - If any of the hardware and/or platform (such as the operating system) of

the target host changes, software changes are needed to maintain adaptability.

 Organization Changes - If there is any business-level change at the client's end, such as

a reduction in organizational strength, acquisition of another company, or the organization venturing into

a new business, the need to modify the original software may arise.

Types of maintenance

In a software lifetime, the type of maintenance may vary based on its nature. It may be just a routine

maintenance task, such as a bug discovered by a user, or it may be a large event in itself, based

on the size or nature of the maintenance. The following are some types of maintenance based on their

characteristics (a small classification sketch follows the list):

 Corrective Maintenance - This includes modifications and updates done in order to correct

or fix problems, which are either discovered by the user or concluded from user error reports.

 Adaptive Maintenance - This includes modifications and updates applied to keep the

software product up to date and tuned to the ever-changing world of technology and the business

environment.

 Perfective Maintenance - This includes modifications and updates done in order to keep the

software usable over a long period of time. It includes new features and new user requirements for

refining the software and improving its reliability and performance.


 Preventive Maintenance - This includes modifications and updates to prevent future

problems with the software. It aims to address problems which are not significant at this moment

but may cause serious issues in the future.
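
To keep the four categories straight, here is a small, purely illustrative Python sketch that tags incoming change requests with a maintenance type. The keyword rules are invented and are not part of any standard; a real triage process relies on human judgment.

    # Illustrative classification of maintenance requests into the four types.
    from enum import Enum

    class MaintenanceType(Enum):
        CORRECTIVE = "corrective"    # fix reported defects
        ADAPTIVE   = "adaptive"      # adapt to a new platform or environment
        PERFECTIVE = "perfective"    # new features, better performance
        PREVENTIVE = "preventive"    # head off future problems

    def classify(request: str) -> MaintenanceType:
        text = request.lower()
        if "crash" in text or "bug" in text:
            return MaintenanceType.CORRECTIVE
        if "new os" in text or "migrate" in text:
            return MaintenanceType.ADAPTIVE
        if "refactor" in text or "technical debt" in text:
            return MaintenanceType.PREVENTIVE
        return MaintenanceType.PERFECTIVE

    print(classify("App crashes when saving a file"))     # MaintenanceType.CORRECTIVE
    print(classify("Migrate the service to the new OS"))  # MaintenanceType.ADAPTIVE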

Cost of Maintenance

Reports suggest that the cost of maintenance is high. A study on estimating software maintenance

found that the cost of maintenance can be as high as 67% of the cost of the entire software process cycle.

Real-world factors affecting Maintenance Cost

 The standard age of any software is considered to be 10 to 15 years.

 Older software, which was meant to work on slow machines with less memory and storage

capacity, cannot compete with newly developed, enhanced software running on

modern hardware.

 As technology advances, it becomes costly to maintain old software.

 Most maintenance engineers are newcomers and use trial-and-error methods to rectify problems.

 Often, changes made can easily hurt the original structure of the software, making

subsequent changes harder.

 Changes are often left undocumented, which may cause more conflicts in the future.

Software-end factors affecting Maintenance Cost

 Structure of Software Program

 Programming Language

 Dependence on external environment

 Staff reliability and availability

Maintenance Activities

IEEE provides a framework for sequential maintenance process activities. It can be used in an iterative

manner and can be extended so that customized items and processes can be included.
These activities go hand-in-hand with each of the following phases:

 Identification & Tracing - This involves activities pertaining to identification of the requirement for

modification or maintenance. The request is generated by the user, or the system may itself report it via logs or

error messages. The maintenance type is also classified here.

 Analysis - The modification is analyzed for its impact on the system, including safety and

security implications. If the probable impact is severe, an alternative solution is looked for. A set of

required modifications is then materialized into requirement specifications. The cost of

modification/maintenance is analyzed and an estimate is concluded.

 Design - New modules, which need to be replaced or modified, are designed against

requirement specifications set in the previous stage. Test cases are created for validation

and verification.

 Implementation - The new modules are coded with the help of the structured design created in

the design step. Every programmer is expected to do unit testing in parallel.

 System Testing - Integration testing is done among newly created modules. Integration

testing is also carried out between new modules and the system. Finally the system is tested

as a whole, following regression testing procedures.

 Acceptance Testing - After testing the system internally, it is tested for acceptance with the

help of users. If at this stage the users report issues, they are addressed or noted to be

addressed in the next iteration.

 Delivery - After the acceptance test, the system is deployed across the organization, either as a

small update package or as a fresh installation of the system. The final testing takes place at the client

end after the software is delivered.

A training facility is provided if required, in addition to a hard copy of the user manual.

 Maintenance management - Configuration management is an essential part of system

maintenance. It is aided by version control tools to manage versions, semi-versions or patches.

Software Re-engineering

When we need to update the software to keep it relevant to the current market, without impacting its

functionality, it is called software re-engineering. It is a thorough process where the design of the

software is changed and programs are re-written.

Legacy software cannot keep pace with the latest technology available in the market. As the

hardware becomes obsolete, updating the software becomes a headache. Even if software grows old

with time, its functionality does not.

For example, initially Unix was developed in assembly language. When language C came into

existence, Unix was re-engineered in C, because working in assembly language was difficult.

Other than this, sometimes programmers notice that a few parts of the software need more maintenance

than others, and these also need re-engineering.

Re-Engineering Process

 Decide what to re-engineer. Is it the whole software or a part of it?


 Perform Reverse Engineering, in order to obtain specifications of existing software.

 Restructure the program if required, for example, changing function-oriented programs into

object-oriented programs (a small sketch of this follows the list).

 Re-structure data as required.

 Apply Forward engineering concepts in order to get re-engineered software.
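
As a toy illustration of the restructuring step, the sketch below shows the same behaviour written first in a function-oriented way and then restructured into an object-oriented form; the example is invented and deliberately tiny, but it shows that restructuring leaves the functionality unchanged.

    # Before: function-oriented style operating on a bare dictionary.
    def make_account(owner):
        return {"owner": owner, "balance": 0.0}

    def deposit(account, amount):
        account["balance"] += amount

    # After restructuring: the same behaviour in object-oriented form.
    class Account:
        def __init__(self, owner):
            self.owner = owner
            self.balance = 0.0

        def deposit(self, amount):
            self.balance += amount

    old = make_account("alice")
    deposit(old, 50)
    new = Account("alice")
    new.deposit(50)
    print(old["balance"] == new.balance)   # True: functionality unchanged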

There are a few important terms used in software re-engineering:

Reverse Engineering

It is a process to recover the system specification by thoroughly analyzing and understanding the existing

system. This process can be seen as a reverse SDLC model, i.e. we try to reach a higher abstraction level

by analyzing lower abstraction levels.

An existing system has a previously implemented design about which we may know nothing. Designers then

do reverse engineering by looking at the code and trying to recover the design. With the design in hand, they try

to conclude the specifications, thus going in reverse from code to system specification.

Program Restructuring

It is a process to restructure and reconstruct the existing software. It is all about rearranging the

source code, either in the same programming language or from one programming language to a

different one. Restructuring can involve source-code restructuring, data restructuring, or both.

Restructuring does not impact the functionality of the software but enhances reliability and

maintainability. Program components which cause errors very frequently can be changed or

updated with restructuring.

The dependency of software on an obsolete hardware platform can be removed via restructuring.
Forward Engineering

Forward engineering is the process of obtaining the desired software from the specifications in hand, which

were derived by means of reverse engineering. It assumes that some software engineering was

already done in the past.

Forward engineering is the same as the software engineering process, with only one difference – it is

always carried out after reverse engineering.

Reuse Process

Two kinds of method can be adopted: either keep the requirements the same and adjust the

components, or keep the components the same and modify the requirements.

 Requirement Specification - The functional and non-functional requirements which the software

product must comply with are specified, with the help of the existing system, user input or both.
 Design - This is also a standard SDLC process step, where requirements are defined in

terms of software parlance. The basic architecture of the system as a whole and its sub-systems is

created.

 Specify Components - By studying the software design, the designers segregate the entire

system into smaller components or sub-systems. One complete software design turns into a

collection of a huge set of components working together.

 Search Suitable Components - Designers refer to the software component repository

to search for matching components, on the basis of functionality and intended software

requirements (a rough sketch of such a search follows this list).

 Incorporate Components - All matched components are packed together to form the

complete software.
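
A rough sketch of the "search suitable components" step, assuming a simple in-memory repository indexed by the functionality each component provides; the repository contents and the matching rule are invented for illustration.

    # Illustrative component repository search by required functionality.
    repository = {
        "AuthModule":    {"provides": {"login", "logout"}},
        "PaymentModule": {"provides": {"invoice", "refund"}},
        "ReportModule":  {"provides": {"pdf_export", "csv_export"}},
    }

    def find_components(required):
        """Return components whose provided functionality overlaps the requirements."""
        return [name for name, meta in repository.items()
                if meta["provides"] & required]

    print(find_components({"login", "csv_export"}))   # ['AuthModule', 'ReportModule']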

The Nature of WebApps:

The unique nature of WebApps:

In the early days of the World Wide Web, websites were just a set of linked hypertext files which

presented information using text and limited graphics. The augmentation of HTML by development

tools such as Java and XML enabled web engineers to provide computing capability along with informational

content. Web-based systems and applications (WebApps) are sophisticated tools that not only

present stand-alone information but also integrate databases and business applications. Web-based

systems and applications “involve a mixture between print publishing and software development,

between marketing and computing, between internal communications and external relations, and

between art and technology.”

The following are the common attributes for WebApps:


● Network intensiveness: A WebApp resides on a network (Internet or Intranet) and must serve the

needs of a diverse community of clients.

● Concurrency: A large number of users may access the WebApp at one time.

● Unpredictable load: The number of users of the WebApp may vary by orders of magnitude from

day to day.

● Performance: If a WebApp user must wait too long, he or she may decide to go elsewhere.

● Availability: Although the expectation of 100 percent availability is unreasonable, users of popular

WebApps often demand access on a 24/7/365 basis.

● Data driven: The primary function of many WebApps is to use hypermedia to present text,

graphics, audio, and video content to the end user

● Content sensitive: The quality and aesthetic nature of content remains an important determinant

of the quality of a WebApp.

● Continuous evolution: Unlike conventional application software that evolves over a series of

planned, chronologically spaced releases, Web applications evolve continuously.

● Immediacy: WebApps often exhibit a time-to-market that can be a matter of a few days or weeks.

● Security: Because WebApps are available via network access, sensitive content must be

protected and secure modes of data transmission must be provided.

● Aesthetics: An undeniable part of the appeal of a WebApp is its look and feel
