1. Requirements volatility
Requirements volatility is a major reason for the
complexity of software projects.
The common view is that software can be modified
“easily”:
Requirements are not fully specified
before design and construction
Requirements are changed after design and
construction
In software projects, although much effort is put
into the requirements phase to ensure that
requirements are complete and consistent, that is
rarely the case.
This makes the software design phase the most
influential one when it comes to
minimizing the effects of new or changing
requirements.
This forces designers to create designs that
provide solutions to problems at a
given state while also anticipating changes and
accommodating them with
minimal effort.
Because of requirements volatility, software
engineers must have a strong understanding of the
principles of software design and develop skills to
manage complexity and change in software projects.
2. Inconsistent development process
Software engineering is a process-oriented field. In
the design phase, processes involve a set of
activities and tasks required to bridge the gap
between requirements and construction. For
example:
Architectural and detailed designs
Design reviews
Establishing quality evaluation criteria
Establishing design change management and
version control
Adopting design tools
The problem is that in many cases, a company’s
design process
is not well established,
is poorly understood,
is approached with minimalistic expectations,
is focused on one form of design, e.g., user
interface, while ignoring others, or,
is simply not done at all!
3. Fast and ever-changing technology
The technology for designing and implementing
today’s software systems continues to evolve to
provide improved capabilities. For example,
Modeling languages and tools
Programming languages
Integrated development environments
Design strategies, patterns, etc.
As new technologies emerge, software designers are
required to assimilate
and employ them all at the same time.
In some cases, old and new technology needs
to coexist in the same project!
This creates a demand for capable designers that can
assimilate new concepts and technology quickly and
effectively.
This is challenging because of the time
required both to learn new technology and to
complete a project on time, while making sure
that the new technology interoperates well with
legacy systems.
4. Ethical and professional practices
Designers create blueprints that drive the
construction of software.
Tight schedules can create external pressures
to deviate from the formal design process to get the
product out the door.
In some cases, this can have catastrophic
consequences.
To correctly and responsibly execute the
design phase, designers are required to exercise
strong leadership skills to:
Influence and negotiate with stakeholders
Motivate the development team
Enforce ethical guidelines
Evaluate the social impacts of their designs in
the public domain or in safety-critical systems
Follow and enforce the ethical and professional
practices
This is challenging to do under numerous
pressures from different stakeholders,
e.g., management, customers, peers, etc.
5. Managing design influences
Besides negative pressures, designs are shaped by
other influences from stakeholders, the development
organization, and other factors (e.g., the designers’
own experiences).
These influences can have cyclical effects between
the system and its external influences, such that
external factors affect the development of the
system and the system affects its external factors.
Managing these influences is essential for
maximizing the quality of systems and their related
influence on future business opportunities.
Of specific importance are design influences that
come from
the system’s stakeholders and,
its developing organization
Software projects can have a multitude of
stakeholders, each with specific wants and needs
that influence the software design.
Some conflicting with each other!
Each stakeholder believes he/she is correct.
This requires design trade-offs to satisfy each
stakeholder.
Difficult to do in large-scale systems!
This is difficult to do since design decisions need to
accommodate all concerns without negatively
affecting the project.
The developing organization also influences designs
significantly.
Consider software development across site
boundaries!
Consider coordinating design efforts
Consider conducting peer reviews
Consider developing code from design
Consider managing version control
In this case, design must support such
development!
At the same time, designs can influence the
developing organization to enter new areas of
business.
Consider a design that supports plug-ins to
enhance capabilities of the software.
Someone may realize that existing functionality can
be enhanced via plug-in to solve a problem in a
different domain, giving the developing
organization a competitive advantage!
Managing these influences is challenging
because it requires designers to span
out of the technical domain to have keen interest in
the organization as a whole.
6. Software architecture
Corresponds to a macro design approach for
creating models that depict the quality and function
of the software system.
Provides black-box models used to evaluate
the system’s projected capabilities as well as its
expected quality.
Designed using multiple perspectives;
therefore, it allows different stakeholders with
different backgrounds to evaluate the design and
ensure that it addresses their concerns.
It provides the major structural components
and interfaces of the system.
It focuses on the quality aspects of the system
before detailed design or construction can begin.
It serves as an important communication,
reasoning, and analysis tool that supports the
development and growth of the system.
Lays the foundation for all subsequent work in
the software engineering lifecycle.
7. Detailed design
Whereas software architecture deals with the
major structural components and interfaces of the
system, detailed design focuses mostly on the
internals of those components and interfaces.
Begins after the software architecture
is specified, reviewed, and deemed sufficiently
complete.
Builds on the software architecture to provide a
white-box approach to design the structure and
behavior of the system.
Refines the architecture to reach a point where
the software design, including architecture and
detailed design, is deemed sufficiently complete for
the construction phase to begin.
Focuses on functional requirements, whereas
the architecture focuses mostly on non-functional,
or quality, requirements.
Two important tasks of the detailed design
activity include:
Interface Design
Component Design
8. Interface design
Refers to the design activity that deals with
specification of interfaces between components in
the design.
Provide a standardized way for accessing
services provided by software components.
Allows multiple development efforts to occur in
parallel as long as interfaces are obeyed;
therefore, it is one of the first tasks during
detailed design.
Can be performed for both internal interfaces
and external interfaces, e.g., an XML messaging
specification for communication across the network.
9. Component design
During architecture, major components are
identified. During component design, the internal
design of (the structure and behavior of) these
components is created.
In object-oriented systems, using UML,
component designs are typically in the form of class
diagrams, sequence diagrams, etc.
When creating these designs, several design
principles, heuristics, and patterns are often used in
professional practice.
Sometimes referred to as component-level
design.
10. Modularization
It is the principle that drives the continuous
decomposition of the software system until fine-
grained components are created.
One of the most important design principles,
since it allows software systems to be manageable at
all phases of the development life-cycle.
When you modularize a design, you are also
modularizing requirements, programming, test
cases, etc.
Plays a key role during all design activities;
when applied effectively, it provides a roadmap for
software development starting from coarse-grained
components that are further modularized into fine-
grained components directly related to code.
Leads to designs that are easy to understand,
resulting in systems that are easier to develop and
maintain.
Modularization is the process of continuous
decomposition of the software system until fine-
grained components are created. But what guides
the “modularization engine”?
It turns out that two other principles can
effectively guide designers during this process:
Abstraction
Encapsulation
11. Abstraction
Abstraction is the principle that focuses on
essential characteristics of entities—in their active
context—while deferring unnecessary details.
While the principle of modularization specifies
what needs to be done, the principle of abstraction
provides guidance as to how it should be done.
Modularizing systems in ad-hoc manner leads to
designs that are incoherent, hard to understand, and
hard to maintain.
Abstraction can be employed to extract
essential characteristics of:
Data
Procedures or behavior
Procedural abstraction
Specific type of abstraction that simplifies
reasoning about behavioral operations containing a
sequence of steps.
We use this all the time, e.g., consider the
statement “Computer 1 SENDS a message to server
computer 2”
Imagine if we had to say, e.g., “Computer 1
retrieves the server’s information, opens a TCP/IP
connection, sends the message, waits for a response,
and closes the connection.” Luckily, the procedural
abstraction SEND helps simplify the operations so
that we can reason about these operations more
efficiently.
Data abstraction
Specific type of abstraction that simplifies
reasoning about structural composition of data
objects.
In the previous example, MESSAGE is an
example of data abstraction; the details of a
MESSAGE can be deferred to later stages of the
design phase.
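The SEND and MESSAGE abstractions discussed above can be sketched in code. This is an illustrative sketch only: the Message class and send function are hypothetical names, and the network steps are stubbed out as strings rather than real TCP/IP calls.

```python
from dataclasses import dataclass

# Data abstraction: MESSAGE hides its structural details behind a type.
@dataclass
class Message:
    recipient: str
    body: str

# Procedural abstraction: SEND hides the connect/transmit/wait/close
# sequence behind one operation callers can reason about as a single step.
def send(message: Message) -> str:
    steps = [
        f"resolve {message.recipient}",
        "open connection",
        f"transmit {len(message.body)} bytes",
        "await response",
        "close connection",
    ]
    return "; ".join(steps)

print(send(Message("server2", "hello")))
```

Clients of send never see the five internal steps, which is exactly what lets us reason about "Computer 1 SENDS a message" as one operation.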
12. Encapsulation
Principle that deals with providing access to
the services of abstracted entities by exposing only
the information that is essential to carry out such
services while hiding details of how the services are
carried out.
When applied to data, encapsulation provides
access only to the necessary data of abstracted
entities, no more, no less.
Encapsulation and abstraction go hand in hand.
When we do abstraction, we hide details…
When we do encapsulation, we revise our
abstractions to enforce that abstracted entities only
expose essential information, no more, no less.
Encapsulation forces us to create good
abstractions!
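A common way to illustrate encapsulation is a small class that hides its state behind a minimal set of operations. The BankAccount example below is a hypothetical sketch, not taken from the notes:

```python
# Encapsulation sketch: the account exposes only the operations needed
# to use it (deposit, withdraw, read the balance); the balance field
# itself stays hidden as an internal detail.
class BankAccount:
    def __init__(self):
        self._balance = 0  # internal detail; clients use the methods below

    def deposit(self, amount: int) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def withdraw(self, amount: int) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> int:
        # Read-only access: essential information, no more, no less.
        return self._balance
```

Clients cannot set the balance directly, so every change goes through the validated operations, which is the "no more, no less" exposure the principle calls for.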
13. Cohesion and coupling
Cohesion
The manner and degree to which the tasks
performed by a single software module are related
to one another.
Measures how well design units are put
together for achieving a particular task.
Cohesion can be classified as:
Functional cohesion
Procedural (or sequential) cohesion
Temporal cohesion
Communication cohesion
Coupling
Refers to the manner and degree of
interdependence between software modules.
Measurement of dependency between units.
The higher the coupling, the higher the dependency
and vice versa.
Important types of coupling include:
Content coupling
The most severe type, since it refers to
modules that modify and rely on internal
information of other modules.
Common coupling
Refers to dependencies based on common
access areas, e.g., global variables.
When this occurs, changes to the global area
cause changes in all dependent modules.
Less severe than content coupling.
Data coupling
Dependency through data passed between
modules, e.g., through function parameters.
Does not depend on other modules’
internals or globally accessible data, therefore,
design units are shielded from changes in other
places.
In all cases, a high degree of coupling gives
rise to negative side effects.
Quality, in terms of reusability and
maintainability, decreases.
When coupling increases, so does the complexity
of managing and maintaining design
units.
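The difference between common coupling and data coupling can be shown with a small sketch; the tax-calculation functions below are hypothetical examples:

```python
# Common coupling (less desirable): modules depend on a shared global
# area, so a change to TAX_RATE ripples into every dependent module.
TAX_RATE = 0.08

def total_with_global(price: float) -> float:
    return price * (1 + TAX_RATE)

# Data coupling (preferred): the dependency exists only through data
# passed as parameters, so the function is shielded from changes
# elsewhere in the system.
def total_with_param(price: float, tax_rate: float) -> float:
    return price * (1 + tax_rate)
```

Both compute the same result, but only the second can be understood, tested, and reused without knowing anything about global state.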
14. Separation of interface and implementation
Deals with creating modules in such a way that a
stable interface is identified and separated from its
implementation.
Not the same thing as encapsulation!
While encapsulation dictates hiding the details
of implementation, this principle dictates their
separation, so that different implementations of the
same interface can be swapped to provide modified
or new behavior.
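One way to sketch this principle, assuming a hypothetical Sorter interface, is with Python's abstract base classes; different implementations of the same interface can be swapped without touching client code:

```python
from abc import ABC, abstractmethod

# The stable interface, separated from its implementations.
class Sorter(ABC):
    @abstractmethod
    def sort(self, items: list) -> list: ...

class AscendingSorter(Sorter):
    def sort(self, items: list) -> list:
        return sorted(items)

class DescendingSorter(Sorter):
    def sort(self, items: list) -> list:
        return sorted(items, reverse=True)

def report(sorter: Sorter, data: list) -> list:
    # Client code depends only on the Sorter interface, so swapping an
    # implementation provides modified behavior with no changes here.
    return sorter.sort(data)
```

Note this is more than encapsulation: both implementations hide their details, but it is the separate, stable interface that makes them interchangeable.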
15. Sufficiency and completeness
Sufficiency
Deals with capturing enough characteristics of
the abstraction to permit
meaningful interaction
Must provide a full set of operations to allow a
client proper interaction with the abstraction.
Implies a minimal interface.
Completeness
Deals with the interface capturing all the essential
characteristics of the abstraction.
Implies an interface general enough for any
prospective client.
Completeness is subjective, and carried too far
it can have unwanted results.
16. Important quality attributes of software
systems
Some important quality attributes of software
systems:
Usability: The degree of complexity involved when
learning or using the system.
Modifiability: The degree of complexity involved
when changing the system to fit current or future
needs.
Security: The system’s ability to protect and defend
its information or information system.
Performance: The system’s capacity to accomplish
useful work under time and resource constraints.
Reliability: The system’s failure rate.
Portability: The degree of complexity involved
when adapting the system to other software or
hardware environments.
Testability: The degree of complexity involved
when verifying and validating the system’s required
functions.
Availability: The system’s uptime.
Interoperability: The system’s ability to collaborate
with other software or hardware systems.
Notice that these quality attributes also
describe high-level information about desired
characteristics of the software system.
In their current form, they are not sufficient to
develop the system.
For a system to exhibit any of these qualities,
design decisions must be made to support the
achievement of these qualities. These design
decisions are referred by Bass, Clements, and
Kazman as Tactics.
17. Requirements classification
Performed to identify the nature of each
requirement:
Functional vs. non-functional: Classification
that differentiates between requirements that specify
the functional aspects of the system vs. the ones that
place constraints on how the functional aspects are
achieved.
Product vs. process: Requirements placed on
the system product vs. requirements placed on the
process employed to build such a product.
Imposed vs. derived: requirements imposed by
stakeholders vs. requirements that are derived by the
development team.
18. Specificity of requirements
On being specific,
Requirements need to be specified in a clear,
concise, and exclusive manner.
Clear requirements are not open to
interpretation; unclear or ambiguous requirements
lead to incorrect designs, incorrect implementations,
and deceptive validation during testing.
Concise requirements are brief and to the point
and are therefore easier to understand.
Exclusive requirements specify one, and only
one, thing, making them easier to verify.
21. Consistency, attainability, and verifiability of
requirements
On being consistent,
Requirements are consistent when they do not
preclude the design or construction of other
requirements.
On being attainable,
Requirements that are unattainable serve no
purpose.
Attainability can be determined for both
product and process.
On being verifiable,
Perhaps the most obvious desirable
characteristic of requirements.
Requirements that cannot be verified cannot be
claimed as met.
The inability to verify requirements points to a
serious flaw early in the development process.
22. Data-centered
Data-centered systems are systems primarily
decomposed around a main central
repository of data. These include:
Data management component
The data management component controls,
provides, and manages access to the system’s data.
Worker components
Worker components execute operations and
perform work based on the data.
Communication in data-centered systems is
characterized by a one-to-one bidirectional
communication between a worker component and
the data management component.
Worker components do not interact with each
other directly; all communication goes through the
data management component.
Because of their architecture, these systems
must address issues with:
Data integrity
Communication protocols between worker and
data management
Transactions and recovery (also known as roll-
back)
Security
A common architectural pattern for data-centered
systems is the Blackboard Pattern.
Blackboard decomposes systems into components
that work around a central data component to
provide solutions to complex problems.
These components work independently from
each other to provide partial solutions to problems
using an opportunistic problem-solving approach.
That is, there are no predetermined, or correct,
sequences of operations for reaching the
problem’s solution.
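A minimal, hypothetical sketch of the Blackboard structure might look as follows; the knowledge sources, controller loop, and problem data are all invented for illustration:

```python
# Blackboard sketch: independent worker components ("knowledge sources")
# post partial solutions to a central data component; they never talk to
# each other directly.
class Blackboard:
    def __init__(self, data):
        self.data = data             # shared problem state
        self.partial_solutions = []  # accumulated partial solutions

def uppercase_source(bb: Blackboard) -> None:
    # Knowledge source 1: contributes when raw text is present.
    if "text" in bb.data:
        bb.partial_solutions.append(bb.data["text"].upper())

def length_source(bb: Blackboard) -> None:
    # Knowledge source 2: contributes a different partial solution.
    if "text" in bb.data:
        bb.partial_solutions.append(len(bb.data["text"]))

def controller(bb: Blackboard, sources) -> list:
    # Opportunistic control: sources run whenever they can contribute;
    # no fixed sequence of operations is assumed.
    for source in sources:
        source(bb)
    return bb.partial_solutions
```

Because each source reads and writes only the blackboard, sources can be added or removed without modifying one another, which is the basis of the modifiability property discussed below.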
Quality properties of the Blackboard architectural
pattern include the ones specified below:
Modifiability: Agents are compartmentalized and
independent from each other; therefore, it is easy to
add or remove agents to fit new systems.
Reusability: Specialized components can be reused
easily in other applications.
Maintainability: Allows for separation of concerns
and independence of the knowledge-based agents;
therefore, maintaining existing components
becomes easier.
An important aspect of the Blackboard and any
other architectural pattern is its deployment aspect
(i.e., the deployment view). For example, it is not
easily determined from the logical view where each
agent or blackboard component resides.
Depending on their location, Blackboard
systems can have increased complexity when
managing communication between agents,
controller, and blackboard.
23. Data-flow
Data flow systems are decomposed around the
central theme of transporting data (or data
streams) and transforming the data along the way to
meet application-specific requirements.
Typical responsibilities found in components
of data-flow systems include:
Worker components, those that perform
work on data
Transport components, those that
transport data
Worker components abstract data transformations
and processing that need to take place before
forwarding data streams in the system, e.g.,
Encryption and decryption
Compression and decompression
Changing data format, e.g., from binary to
XML, from raw data to information, etc.
Enhancing, modifying, storing, etc. of the data
Transport components abstract the management and
control of the data transport
mechanisms, which could include:
Inter-process communication
Sockets, serial, pipes, etc.
Intra-process communication
Direct function call, etc.
An example of an architectural pattern for data flow
systems is the Pipes-and-Filters.
Pipes-and-Filters is composed of the following
components:
Data source
Produces the data
Filter
Processes, enhances, modifies, etc. the data
Pipes
Provide connections between data source
and filter, filter to filter, and filter to data sink.
Data Sink
Data consumer
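The four component roles above can be sketched as a minimal, hypothetical pipeline; real pipes-and-filters systems would typically use streams or processes rather than plain function calls:

```python
# Pipes-and-Filters sketch: each filter is an independent callable, and
# the "pipe" simply feeds one filter's output into the next.
def source() -> str:
    return "hello world"           # data source: produces the data

def uppercase_filter(data: str) -> str:
    return data.upper()            # filter: transforms the data

def exclaim_filter(data: str) -> str:
    return data + "!"              # filter: enhances the data

def pipeline(data: str, filters) -> str:
    # Pipes: connect source -> filter -> filter -> ... -> sink.
    for f in filters:
        data = f(data)
    return data

# Data sink: consumes the final result.
result = pipeline(source(), [uppercase_filter, exclaim_filter])
```

Since each filter depends only on its input data, filters can be reordered, added, or removed, which is what enables the extensibility and reusability properties listed below.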
Quality properties of the Pipes-and-Filters
architectural pattern include the
ones specified below:
Extensibility: Processing filters can be added easily
for more capabilities.
Efficiency: By connecting filters in parallel,
concurrency can be achieved to reduce latency in
the system.
Reusability: By compartmentalizing pipes and
filters, they can both be reused as-is in other systems.
Modifiability: Filters are compartmentalized and
independent from each other; therefore, it is easy to
add or remove filters to enhance the system.
Security: At any point during data flow, security
components can be injected into the workflow to
provide different types of security mechanisms for
the data.
Maintainability: Allows for separation of concerns
and independence of the Filters and Pipes; therefore,
maintaining existing components becomes easier.
24. Distributed
Distributed systems are decomposed into multiple
processes that (typically) collaborate through the
network.
These systems are ubiquitous in today’s
modern systems thanks to wireless, mobile, and
internet technology.
In some distributed systems, one or more
distributed processes perform work on behalf of
client users and provide a bridge to some server
computer, typically located remotely and
performing work delegated to it by the client part of
the system.
Other distributed systems may be composed
of peer nodes, each with similar capabilities and
collaborating together to provide enhanced services,
such as music-sharing distributed applications.
These types of distributed systems are easy to
spot, since their deployment architecture entails
multiple physical nodes.
However, with the advent of multi-core
processors, distributed architectures are also
relevant to software that executes on a single node
with multiprocessor capability.
Some examples of distributed systems include:
Internet systems, web services, file- or music-
sharing systems, high-performance systems, etc.
Common architectural patterns for distributed
systems include:
Client-Server Pattern
Broker Pattern
Quality properties of the Client-Server architectural
pattern include the ones specified below:
Interoperability: Allows clients on different
platforms to interoperate with servers of different
platforms.
Modifiability: Allows for centralized changes in the
server and quick distribution among many clients.
Reusability: By separating server from clients,
services or data provided by the server can be
reused in different applications.
25. Interactive
Interactive systems support user interactions,
typically through user interfaces.
When designing these systems, two main
quality attributes are of interest:
Usability
Modifiability
The mainstream architectural pattern employed in
most interactive systems is the Model-View-
Controller (MVC).
The MVC pattern is used in interactive applications
that require flexible
incorporation of human-computer interfaces. With
the MVC, systems are
decomposed into three main types of components:
Model: Component that represents the system’s
core, including its major processing capabilities and
data.
View: Component that represents the output
representation of the system (e.g., graphical or
console-based output)
Controller: Component (associated with a view) that
handles user inputs.
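The three MVC component types can be sketched minimally; the counter example and all class names below are hypothetical illustrations:

```python
# MVC sketch: the model holds the system's core data, the view renders
# an output representation, and the controller handles user input.
class CounterModel:
    def __init__(self):
        self.count = 0               # core data of the system

class CounterView:
    def render(self, model: CounterModel) -> str:
        # Console-based output representation of the model's state.
        return f"count = {model.count}"

class CounterController:
    def __init__(self, model: CounterModel):
        self.model = model

    def handle_input(self, command: str) -> None:
        # User input updates the model, never the view directly.
        if command == "increment":
            self.model.count += 1
```

Because the view only reads the model, alternative views (graphical, console) can be attached without changing the model or controller, which is where MVC's usability and modifiability benefits come from.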
26. Hierarchical
Hierarchical systems can be decomposed and
structured in hierarchical fashion. Two common
architectural patterns for hierarchical systems are:
Main program and subroutine
Layered
Quality properties of the Main Program and
Subroutine architectural pattern include:
Modifiability: By decomposing the system into
independent, single purpose components, each
component becomes easier to understand and
manage.
Reusability: Independent, finer grained components
can be reused in other systems.
Quality properties of the Layered architectural
pattern include the ones specified below:
Modifiability: Dependencies are kept local within
layer components. Since components can only
access other components through a well-defined and
unified interface, the system can be modified easily
by swapping layer components with other enhanced
or new layer components.
Portability: Services that deal directly with a
platform’s APIs can be encapsulated using a
system layer component. Higher-level layers rely on
this component to provide system services to the
application; therefore, by porting the system’s API
layer to other platforms, systems become more
portable.
Security: The controlled hierarchical structure of
layered systems allows for easy incorporation of
security components to encrypt/decrypt
incoming/outgoing data.
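A minimal, hypothetical layered sketch might look as follows; every class name is invented, and each layer talks only to the layer directly below it through a small interface:

```python
# Layered sketch: application -> service -> system, top to bottom.
class SystemLayer:
    # Encapsulates platform-specific services (e.g., storage access);
    # porting the system to another platform means replacing this layer.
    def read_raw(self) -> str:
        return "raw-bytes"

class ServiceLayer:
    def __init__(self, system: SystemLayer):
        self.system = system

    def load(self) -> str:
        # Relies only on the layer below through its interface, so a
        # swapped-in SystemLayer leaves this layer unchanged.
        return self.system.read_raw().upper()

class ApplicationLayer:
    def __init__(self, service: ServiceLayer):
        self.service = service

    def run(self) -> str:
        return f"app sees: {self.service.load()}"
```

Keeping dependencies local to each layer boundary is what makes the modifiability and portability properties above achievable.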
The idea behind the OCP (Open-Closed Principle)
is to locate the areas of the software that are
likely to vary, so that the variations can be
encapsulated and implemented through polymorphism.
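A small, hypothetical sketch of this idea: the varying part (here, a discount policy) is placed behind an abstraction so that new variations extend the design rather than modify existing code:

```python
from abc import ABC, abstractmethod

# The area likely to vary is encapsulated behind an abstraction.
class DiscountPolicy(ABC):
    @abstractmethod
    def apply(self, price: float) -> float: ...

class NoDiscount(DiscountPolicy):
    def apply(self, price: float) -> float:
        return price

class HolidayDiscount(DiscountPolicy):
    def apply(self, price: float) -> float:
        return price * 0.9

def checkout(price: float, policy: DiscountPolicy) -> float:
    # Closed for modification: this code never changes when new
    # DiscountPolicy subclasses are introduced (open for extension).
    return policy.apply(price)
```

Adding a new policy is a new subclass, not an edit to checkout, which is exactly the open-for-extension, closed-for-modification behavior the OCP describes.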