
1. SOFTWARE DESIGN CHALLENGES


REQUIREMENTS VOLATILITY

Requirements volatility is a major reason for the complexity of software projects.


The common view is that software can be modified easily
Requirements are not fully specified before design and construction
Requirements are changed after design and construction
In software projects, although much effort is put into the requirements phase to ensure that
requirements are complete and consistent, that is rarely the case.
This makes the software design phase the most influential one when it comes to minimizing the
effects of new or changing requirements.
This forces designers to create designs that provide solutions to problems at a given state while
also anticipating changes and accommodating them with minimal effort.
Because of requirements volatility, software engineers must have a strong understanding of the
principles of software design and develop skills to manage complexity and change in software
projects.

INCONSISTENT DEVELOPMENT PROCESS

Software engineering is a process-oriented field. In the design phase, processes involve a set of
activities and tasks required to bridge the gap between requirements and construction. For
example:
Architectural and detailed designs
Design reviews
Establishing quality evaluation criteria
Establishing design change management and version control
Adopting design tools
The problem is that in many cases, a company's design process
is not well established,
is poorly understood,
is approached with minimalistic expectations,
is focused on one form of design, e.g., user interface, while ignoring others, or
is simply not done at all!
FAST AND EVER-CHANGING TECHNOLOGY

The technology for designing and implementing today's software systems continues to evolve to
provide improved capabilities. For example:
Modeling languages and tools
Programming languages
Integrated development environments
Design strategies, patterns, etc.
As new technologies emerge, software designers are required to assimilate and
employ them all at the same time. In some cases, old and new technology needs
to coexist in the same project!
This creates a demand for capable designers who can assimilate new concepts and technology
quickly and effectively. This is challenging because designers must spend time learning the new
technology while still completing the project on time and making sure that the new technology
interoperates well with old legacy systems.

ETHICAL AND PROFESSIONAL PRACTICES

Designers create blueprints that drive the construction of software


Tight schedules can create external pressures to deviate from the formal design process to get
the product out the door.
In some cases, this can have catastrophic consequences
To correctly and responsibly execute the design phase, designers are required to exert strong
leadership skills to:
Influence and negotiate with stakeholders
Motivate the development team
Enforce ethical guidelines
Evaluate the social impacts of their designs in the public domain or in
safety critical systems
Follow and enforce the ethical and professional practices
This is challenging when working under numerous pressures from different
stakeholders, e.g., management, customers, peers, etc.
MANAGING DESIGN INFLUENCES
Besides negative pressures, designs are shaped by other influences from stakeholders, the
development organization, and other factors (e.g., the designer's own experiences).
These influences can have cyclical effects between the system and its external influences,
such that external factors affect the development of the system and the system affects its
external factors
Managing these influences is essential for maximizing the quality of systems and their
related influence on future business opportunities.
Of specific importance are design influences that come from the system's stakeholders and
its developing organization.
Software projects can have a multitude of stakeholders, each with specific wants and needs
that influence the software design.
Some of these conflict with each other, and each stakeholder believes he or she is correct.
Satisfying everyone requires design trade-offs, which is difficult in large-scale systems,
since design decisions need to accommodate all concerns without negatively affecting the
project.
The developing organization also influences designs significantly.
Consider software development across site boundaries!
Consider coordinating design efforts
Consider conducting peer reviews
Consider developing code from design
Consider managing version control
In this case, design must support such development!
At the same time, designs can influence the developing organization to enter new areas of
business.
Consider a design that supports plug-ins to enhance capabilities of the software. Someone may
realize that existing functionality can be enhanced via plug-in to solve a problem in a different
domain, giving the developing organization a competitive advantage!
Managing these influences is challenging because it requires designers to step outside the
technical domain and take a keen interest in the organization as a whole.
2. SOFTWARE DESIGN PROCESS

SOFTWARE ARCHITECTURE
Corresponds to a macro design approach for creating models that depict the quality and
function of the software system.
Provides black-box models used to evaluate the system's projected capabilities as well as its
expected quality.
Because it is designed using multiple perspectives, it allows different stakeholders with different
backgrounds to evaluate the design and ensure that it addresses their concerns.
It provides the major structural components and interfaces of the system
It focuses on the quality aspects of the system before detailed design or construction can begin.
It serves as an important communication, reasoning, and analysis tool that supports the
development and growth of the system.
Lays the foundation for all subsequent work in the software engineering lifecycle.

DETAILED DESIGN
Whereas software architecture deals with the major structural components and interfaces of
the system, detailed design focuses mostly on the internals of those components and interfaces.
Begins after the software architecture is specified, reviewed, and deemed sufficiently
complete.
Builds on the software architecture to provide a white-box approach for designing the
structure and behavior of the system.
Refines the architecture to reach a point where the software design, including architecture
and detailed design, is deemed sufficiently complete for the construction phase to begin.
Focuses on functional requirements, whereas the architecture focuses mostly on nonfunctional,
or quality, requirements.
Two important tasks of the detailed design activity include:
Interface Design
Component Design

INTERFACE DESIGN

Refers to the design activity that deals with specification of interfaces between components in the
design.
Provide a standardized way for accessing services provided by software components.
Allow for multiple efforts to occur in parallel, as long as interfaces are obeyed; therefore,
interface design is one of the first tasks during detailed design.
Can be performed for both internal interfaces and external interfaces, e.g., XML messaging
specification for communication across the network.

COMPONENT DESIGN
During architecture, major components are identified. During component design, the internal
design of (the structure and behavior of) these components is created.
In object-oriented systems, using UML, component designs are typically in the form of class
diagrams, sequence diagrams, etc.
When creating these designs, several design principles, heuristics, and patterns are often used
in professional practice.
Sometimes referred to as component-level design.

DESIGN PRINCIPLES

MODULARIZATION
It is the principle that drives the continuous decomposition of the software system until fine-
grained components are created.
One of the most important design principles, since it allows software systems to be manageable
at all phases of the development life-cycle.
When you modularize a design, you are also modularizing requirements, programming, test
cases, etc.
Plays a key role during all design activities; when applied effectively, it provides a roadmap for
software development starting from coarse-grained components that are further modularized into
fine-grained components directly related to code.
Leads to designs that are easy to understand, resulting in systems that are easier to develop and
maintain.
Modularization is the process of continuous decomposition of the software system until fine-
grained components are created. But what drives and justifies this decomposition?
It turns out that two other principles can effectively guide designers during this process
Abstraction
Encapsulation

ABSTRACTION
Abstraction is the principle that focuses on essential characteristics of entities, in their
active context, while deferring unnecessary details.
While the principle of modularization specifies what needs to be done, the principle of
abstraction provides guidance as to how it should be done. Modularizing systems in an ad hoc
manner leads to designs that are incoherent, hard to understand, and hard to maintain.
Abstraction can be employed to extract essential characteristics of:
Data
Procedures or behavior
Procedural abstraction
Specific type of abstraction that simplifies reasoning about behavioral
operations containing a sequence of steps.

Data abstraction
Specific type of abstraction that simplifies reasoning about structural composition of data objects.
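Both kinds of abstraction can be illustrated with a small Python sketch (the `Stack` and `normalize` names below are hypothetical examples chosen for this note, not from the text):

```python
# Data abstraction: a Stack exposes only the operations that matter
# (push, pop, peek) and hides how the items are actually stored.
class Stack:
    def __init__(self):
        self._items = []  # storage detail, not part of the abstraction

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def peek(self):
        return self._items[-1]

# Procedural abstraction: one name stands for a sequence of steps, so
# callers can reason about "normalize" without knowing the steps.
def normalize(text):
    text = text.strip()            # 1. drop surrounding whitespace
    text = text.lower()            # 2. unify case
    return " ".join(text.split())  # 3. collapse internal whitespace

s = Stack()
s.push(1)
s.push(2)
print(s.pop())                         # prints 2
print(normalize("  Hello   World  "))  # prints "hello world"
```

Callers of `Stack` and `normalize` reason only about the essential behavior; the deferred details (the list, the individual steps) can change without affecting them.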

ENCAPSULATION
Principle that deals with providing access to the services of abstracted entities by exposing only
the information that is essential to carry out such services while hiding details of how the services
are carried out.
When applied to data, encapsulation provides access only to the necessary data of abstracted
entities, no more, no less.
Encapsulation and abstraction go hand in hand.
When we do abstraction, we hide details
When we do encapsulation, we revise our abstractions to enforce that abstracted entities only
expose essential information, no more, no less.
Encapsulation forces us to create good abstractions!
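A minimal Python sketch of this idea (the `Account` class is a hypothetical example): the balance can be read and changed only through the services the abstraction exposes, no more, no less.

```python
# Encapsulation: the balance can only change through the service
# methods; the underlying attribute is hidden via name mangling.
class Account:
    def __init__(self, opening_balance=0):
        self.__balance = opening_balance  # hidden detail

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.__balance += amount

    @property
    def balance(self):  # read-only view of the essential information
        return self.__balance

acct = Account()
acct.deposit(50)
print(acct.balance)  # prints 50
```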

COUPLING
Refers to the manner and degree of interdependence between software modules.
Measurement of dependency between units. The higher the coupling, the higher the dependency,
and vice versa.
Content coupling
The most severe type, since it refers to modules that modify and rely on internal information of
other modules.
Common coupling
Refers to dependencies based on common access areas, e.g., global variables.
When this occurs, changes to the global area cause changes in all dependent modules.
Lesser severity than content coupling.
Data coupling
Dependency through data passed between modules, e.g., through function parameters.
Does not depend on other modules' internals or globally accessible data; therefore, design units
are shielded from changes in other places.
In all cases, a high degree of coupling gives rise to negative side effects:
Quality, in terms of reusability and maintainability, decreases.
When coupling increases, so does the complexity of managing and maintaining design units.
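The difference between common and data coupling can be sketched in Python (the function and variable names are illustrative assumptions for this example):

```python
# Common coupling: both callers and this function depend on a shared
# global; reshaping `config` forces changes in every dependent function.
config = {"rate": 0.25}

def taxed_total_common(amount):
    return amount * (1 + config["rate"])

# Data coupling: the dependency travels through a parameter, so the
# function is shielded from changes made elsewhere in the program.
def taxed_total_data(amount, rate):
    return amount * (1 + rate)

print(taxed_total_common(100))       # prints 125.0
print(taxed_total_data(100, 0.25))   # prints 125.0
```

Both compute the same result, but only the data-coupled version can be understood, tested, and reused without knowing about the global state.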

COHESION
The manner and degree to which the tasks performed by a single software module are related to
one another
Measures how well design units are put together for achieving a particular task.
Cohesion can be classified as:
Functional cohesion
Procedural (or sequential) cohesion
Temporal cohesion
Communication cohesion
High cohesion is good; low cohesion is bad.
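A rough Python illustration of the contrast (hypothetical functions, assumed for this example):

```python
# High (functional) cohesion: every statement serves one task.
def mean(values):
    return sum(values) / len(values)

# Low cohesion: parsing, computing, and presenting are unrelated
# responsibilities jammed into a single function.
def process(raw):
    values = [float(v) for v in raw.split(",")]  # parsing
    m = sum(values) / len(values)                # computing
    print(f"mean = {m}")                         # presenting
    return m

print(mean([1, 2, 3]))  # prints 2.0
```

The cohesive `mean` is easy to name, test, and reuse; the low-cohesion `process` must change whenever the input format, the computation, or the output format changes.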

SEPARATION OF INTERFACE AND IMPLEMENTATION

Deals with creating modules in such a way that a stable interface is identified and separated
from its implementation.
Not the same thing as encapsulation!
While encapsulation dictates hiding the details of implementation, this principle dictates their
separation, so that different implementations of the same interface can be swapped to provide
modified or new behavior.
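This swapping can be sketched in Python using an abstract base class as the stable interface (the `Sorter` names are illustrative, not from the text):

```python
from abc import ABC, abstractmethod

# The stable interface: clients program against this abstraction,
# never against a concrete implementation.
class Sorter(ABC):
    @abstractmethod
    def sort(self, items):
        ...

# Two swappable implementations of the same interface.
class BuiltinSorter(Sorter):
    def sort(self, items):
        return sorted(items)

class ReverseSorter(Sorter):
    def sort(self, items):
        return sorted(items, reverse=True)

def report(sorter, items):
    # Works unchanged no matter which implementation is plugged in.
    return sorter.sort(items)

print(report(BuiltinSorter(), [3, 1, 2]))  # prints [1, 2, 3]
print(report(ReverseSorter(), [3, 1, 2]))  # prints [3, 2, 1]
```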

ARCHITECTURE

IMPORTANT QUALITY ATTRIBUTES OF SOFTWARE SYSTEMS

1. Usability - The degree of complexity involved when learning or using the system.

2. Modifiability - The degree of complexity involved when changing the system to fit current or
future needs.

3. Security - The system's ability to protect and defend its information or information system.

4. Performance - The system's capacity to accomplish useful work under time and resource
constraints.

5. Reliability - The system's failure rate.

6. Portability - The degree of complexity involved when adapting the system to other software
or hardware environments.

7. Testability - The degree of complexity involved when verifying and validating the system's
required functions.

8. Availability - The system's uptime.

9. Interoperability - The system's ability to collaborate with other software or hardware
systems.

REQUIREMENTS CLASSIFICATION
1. Functional vs. Non-Functional
Classification that differentiates between requirements that specify the functional aspects of the
system vs. those that place constraints on how the functional aspects are achieved.

2. Product vs. Process


Requirements placed on the system product vs. requirements placed on the process employed
to build that product.

3. Imposed vs. Derived


Requirements imposed by stakeholders vs. requirements that are derived by the development
team.

SPECIFICITY OF REQUIREMENTS
On being specific, requirements need to be specified in a clear, concise, and exclusive manner.
Clear requirements are not open to interpretation; unclear or ambiguous requirements lead to
incorrect designs, incorrect implementations, and deceptive validation during test.
Concise requirements are brief and to the point and are therefore easier to understand.
Exclusive requirements specify one, and only one, thing, making them easier to verify.

CORRECTNESS OF REQUIREMENTS
On being correct, requirements must accurately describe a desired system function.
In some cases, correctness of requirements is easily identified; in others, it is not.
Incorrect requirements can lead to incorrect or undesired behavior.
COMPLETENESS OF REQUIREMENTS
On being complete, requirements must be complete both individually and as a collective set.
This means that each requirement should be specified thoroughly so that it absolutely describes
the functions required to meet some need.
Collectively, requirements need to provide a complete specification of the software's required
functionality in the software requirements specification (SRS).
Incomplete requirements lead to incomplete designs, which in turn lead to incomplete
construction of the software system.
Requirements that are complete help clarify questions during construction and testing by
providing the information necessary to disambiguate or prevent misinterpretations of required
functionality.
Completeness is hard to achieve because it is not always obvious, and sometimes too difficult
to determine, when information is missing.

CONSISTENCY, ATTAINABILITY AND VERIFIABILITY OF REQUIREMENTS


On being consistent, requirements are consistent when they do not preclude the design or
construction of other requirements.
On being attainable, requirements that are unattainable serve no purpose.
Attainability can be determined for both product and process.
On being verifiable, this is perhaps the most obvious desirable characteristic of requirements.
Requirements that cannot be verified cannot be claimed as met.
Inability to verify requirements points to a serious flaw early in the development process.

ARCHITECTURAL STYLES AND PATTERNS

DATA-CENTERED
Data-centered systems are systems primarily decomposed around a main central repository of
data. These include:
Data management component - controls, provides, and manages access to the system's
data.
Worker components - execute operations and perform work based on the data
Communication in data-centered systems is characterized by a one-to-one bidirectional
communication between a worker component and the data management component.
Worker components do not interact with each other directly; all communication goes
through the data management component

Because of the architecture of these systems, they must consider issues with:
Data integrity
Communication protocols between worker and data management
Transactions and recovery (also known as roll-back)
Security
A common architectural pattern for data-centered systems is the Blackboard Pattern.
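A toy sketch of the Blackboard idea in Python (the repository and worker names are illustrative assumptions): workers never call each other; all communication goes through the shared data store.

```python
# Toy blackboard: the shared repository that all workers read and write.
class Blackboard:
    def __init__(self):
        self.data = {}

# Worker components operate only on the blackboard, never on each other.
class Tokenizer:
    def run(self, bb):
        bb.data["tokens"] = bb.data["text"].split()

class Counter:
    def run(self, bb):
        bb.data["count"] = len(bb.data["tokens"])

bb = Blackboard()
bb.data["text"] = "to be or not to be"
for worker in (Tokenizer(), Counter()):
    worker.run(bb)           # all communication goes through bb
print(bb.data["count"])      # prints 6
```

A real blackboard system would add control over which worker runs next, plus the integrity, transaction, and security concerns listed above; this sketch shows only the communication shape.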

DATA-FLOW
Data flow systems are decomposed around the central theme of transporting data (or data
streams) and transforming the data along the way to meet application-specific requirements.
Typical responsibilities found in components of data-flow systems include:
Worker components, those that perform work on data
Transport components, those that transport data
Worker components abstract data transformations and processing that need to take
place before forwarding data streams in the system, e.g.,
Encryption and decryption
Compression and decompression
Changing data format, e.g., from binary to XML, from raw data to information, etc.
Enhancing, modifying, storing, etc. of the data
Transport components abstract the management and control of the data transport
mechanisms, which could include:
Inter-process communication - Sockets, serial, pipes, etc.
Intra-process communication - Direct function call, etc.
An example of an architectural pattern for data-flow systems is Pipes-and-Filters.
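A minimal Pipes-and-Filters sketch in Python, using generators as filters (the filter names are hypothetical examples):

```python
# Each filter transforms a data stream and forwards it; composing
# filters forms the pipeline, with generators acting as the pipes.
def strip_blanks(stream):
    for item in stream:
        if item.strip():     # drop blank entries from the stream
            yield item

def uppercase(stream):
    for item in stream:
        yield item.upper()   # transform each entry in place

source = ["hello", "  ", "world"]
pipeline = uppercase(strip_blanks(source))
print(list(pipeline))        # prints ['HELLO', 'WORLD']
```

Because each filter depends only on the stream protocol, filters can be reordered, added, or removed without touching the others.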

DISTRIBUTED
Distributed systems are decomposed into multiple processes that (typically) collaborate through
the network.
These systems are ubiquitous today thanks to wireless, mobile, and internet
technology.
These types of distributed systems are easy to spot, since their deployment architecture entails
multiple physical nodes.
However, with the advent of multi-core processors, distributed architectures are also relevant to
software that executes on a single node with multiprocessor capability.
Examples: Internet systems, web services, file- or music-sharing systems, high performance
systems, etc.
Common architectural patterns for distributed systems include: Client-Server
Pattern and Broker Pattern

INTERACTIVE
Interactive systems support user interactions, typically through user interfaces.
When designing these systems, two main quality attributes are of interest:
Usability
Modifiability
The mainstream architectural pattern employed in most interactive systems is the Model-
View-Controller (MVC).
The MVC pattern is used in interactive applications that require flexible incorporation of
human-computer interfaces. With the MVC, systems are decomposed into three main types
of components:
Model represents the system's core, including its major processing capabilities and data
View represents the output representation of the system
Controller associated with a view, handles user inputs
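The three component types can be sketched in a minimal Python example (the `Counter*` names are hypothetical; a real MVC system would wire the view to a GUI or web framework):

```python
# Model: the system's core data and processing capabilities.
class CounterModel:
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

# View: the output representation of the system's state.
class CounterView:
    def render(self, model):
        return f"count = {model.value}"

# Controller: handles user input and mediates model and view.
class CounterController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle_click(self):                  # user input arrives here
        self.model.increment()               # controller updates the model
        return self.view.render(self.model)  # view shows the new state

ctrl = CounterController(CounterModel(), CounterView())
print(ctrl.handle_click())  # prints "count = 1"
```

Because the model knows nothing about views or controllers, new human-computer interfaces can be added without changing the core.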

HIERARCHICAL
Hierarchical systems can be decomposed and structured in hierarchical fashion. Two common
architectural patterns for hierarchical systems are:
Main program and subroutine
Layered
Quality properties of the Main Program and Subroutine architectural pattern include:
Modifiability
Reusability
Quality properties of the Layered architectural pattern include the ones specified below:
Modifiability
Reusability
Portability
Security

PRINCIPLES OF DETAILED DESIGN

OPEN-CLOSED PRINCIPLE


The Open-Closed principle (OCP) is an essential principle for creating reusable and modifiable
systems that evolve gracefully with time.
Software designs should be open for extension but closed for modification.
The main idea behind the OCP is that code that works should remain untouched and that new
additions should be extensions of the original work.
Being closed to modification does not mean that designs cannot be modified; it means that
modifications should be done by adding new code and incorporating this new code into the
system in ways that do not require old code to be changed!
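A common way to realize the OCP is polymorphism, sketched here in Python (the shape classes are an illustrative example, not from the text):

```python
from abc import ABC, abstractmethod

# total_area() is closed for modification: supporting a new shape
# means adding a class, not editing existing, working code.
class Shape(ABC):
    @abstractmethod
    def area(self):
        ...

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h

class Circle(Shape):  # new behavior added purely by extension
    def __init__(self, r):
        self.r = r

    def area(self):
        return 3.14159 * self.r ** 2

def total_area(shapes):
    # This function never changes when new Shape subclasses appear.
    return sum(s.area() for s in shapes)

print(total_area([Rectangle(2, 3), Circle(1)]))  # prints approximately 9.14159
```

Contrast this with a `total_area` built from `if isinstance(...)` checks, which would have to be edited for every new shape, violating the OCP.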

LISKOV SUBSTITUTION PRINCIPLE


The LSP was originally proposed by Barbara Liskov and serves as the basis for creating designs
that allow clients written against base classes to behave correctly when the objects they use are
actually instances of derived classes.
The LSP requires:
Signatures between base and derived classes to be maintained
The subtype specification to support reasoning based on the super type specification
In simple terms, LSP demands that "any class derived from a base class must honor any
implied contract between the base class and the components that use it."
The Signature Rule ensures that if a program is type-correct based on the super type
specification, it is also type-correct with respect to the subtype specification.
The Method Rule ensures that reasoning about calls of super type methods is valid even
though the calls actually go to code that implements a subtype.
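The classic Rectangle/Square example shows what an LSP violation looks like in practice (a Python sketch with illustrative names):

```python
class Rectangle:
    def __init__(self, w, h):
        self.w, self.h = w, h

    def set_width(self, w):
        self.w = w

    def area(self):
        return self.w * self.h

# A Square that forces width == height breaks the implied Rectangle
# contract: clients that set the width expect the height to be unchanged.
class Square(Rectangle):
    def set_width(self, w):
        self.w = self.h = w  # violates the base-class contract

def client(rect):
    rect.set_width(4)
    return rect.area()       # client reasons: area == 4 * original height

print(client(Rectangle(2, 5)))  # prints 20, as the client expects
print(client(Square(5, 5)))     # prints 16: substitution changed behavior
```

The signatures match, so the code type-checks, but the Method Rule is violated: reasoning that is valid for `Rectangle` no longer holds when a `Square` is substituted.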

INTERFACE SEGREGATION PRINCIPLE


A related idea, the single responsibility principle, holds that well-designed classes should have
one (and only one) reason to change.
The interface segregation principle (ISP) states that "clients should not be forced to depend on
methods that they do not use."
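The ISP can be sketched in Python by splitting a would-be "fat" interface into role interfaces (the worker classes are a hypothetical example):

```python
from abc import ABC, abstractmethod

# Segregated role interfaces: a single fat Worker interface with both
# work() and eat() would force Robot to implement a method it never uses.
class Workable(ABC):
    @abstractmethod
    def work(self):
        ...

class Eatable(ABC):
    @abstractmethod
    def eat(self):
        ...

class Human(Workable, Eatable):
    def work(self):
        return "working"

    def eat(self):
        return "eating"

class Robot(Workable):  # depends only on the interface it actually uses
    def work(self):
        return "working"

print(Robot().work())  # prints "working"
```

Clients that only schedule work depend on `Workable` alone, so changes to `Eatable` can never force them to change.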
