Advances in the dataflow computational model
(Najjar, Lee, & Gao, 1999)


The research work is about the application of the dataflow model of computing across the field, from
software to hardware design. Dataflow program graphs are used to represent the behavior or
architecture of a program.

The first two dataflow models, which emerged in the 1970s, are the Dennis dataflow graph and the Kahn
process network. In a Dennis dataflow graph, operations are specified by actors that are enabled only
when all the actors that produce their required data have completed their execution. The dependence
relationships between pairs of actors are defined by the arcs of a graph, which may be thought of as
conveying the results of an actor to its successor actors, and by the firing rules, which specify exactly what
data are required for an actor to fire. Kahn process networks replace actors with sequential
processes. These processes communicate by sending messages along channels that conceptually consist
of unbounded FIFO queues; a minimal sketch of such a network appears below. Dennis dataflow was
originally applied to computer architecture design, while Kahn dataflow was used by concurrency
theorists for modeling concurrent software. Multithreaded architectures, with dataflow roots, use a
style of dataflow that can be viewed as having elements of both.
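
To make the Kahn model concrete, here is a minimal sketch in Python (my own illustration, not code
from the paper): each sequential process is a thread that blocks on reads, and each channel is a
conceptually unbounded FIFO queue. The process names (producer, doubler, consumer) are invented
for this example.

import threading
import queue

def producer(out_ch):
    # Sequential process: writes a stream of tokens to its output channel.
    for i in range(5):
        out_ch.put(i)
    out_ch.put(None)  # end-of-stream marker

def doubler(in_ch, out_ch):
    # Sequential process: blocks on a read, transforms, then writes.
    while True:
        token = in_ch.get()  # blocking read, as in Kahn's model
        if token is None:
            out_ch.put(None)
            break
        out_ch.put(token * 2)

def consumer(in_ch):
    while True:
        token = in_ch.get()
        if token is None:
            break
        print("consumed", token)

# Channels are conceptually unbounded FIFO queues.
a = queue.Queue()
b = queue.Queue()

threads = [
    threading.Thread(target=producer, args=(a,)),
    threading.Thread(target=doubler, args=(a, b)),
    threading.Thread(target=consumer, args=(b,)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

Because reads are blocking and each channel has a single writer, the network produces the same output
regardless of how the threads are scheduled, which is the determinism property Kahn's model is known for.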

Two forms of dataflow architecture have become known. In a static architecture, the
arc connecting one instruction to another can contain only a single result value (a token) from the
source instruction. In this scheme there can be only one instance of a dataflow actor in execution at any
time. In a dynamic dataflow architecture, tags are conceptually or actually associated with tokens so
that tokens associated with different activations of an actor may be distinguished. This enables arcs to
simultaneously carry multiple tokens, thereby exposing more data parallelism.
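
As an illustrative sketch (the function and variable names here are hypothetical, not from the paper),
the following Python fragment shows how a matching store keyed by tag lets tokens from different
activations of a two-input actor interleave on the same arcs:

# Matching store for a two-input actor in a tagged-token (dynamic) machine.
waiting = {}  # tag -> (port, value) of the first operand to arrive

def fire(tag, left, right):
    # The actor body; here, a simple add.
    print(f"activation {tag}: {left} + {right} = {left + right}")

def receive(tag, port, value):
    # Deliver one token; fire once both operands with the same tag exist.
    if tag in waiting:
        other_port, other_value = waiting.pop(tag)
        left, right = (value, other_value) if port == "L" else (other_value, value)
        fire(tag, left, right)
    else:
        waiting[tag] = (port, value)

# Tokens from different activations may interleave on the same arcs:
receive(0, "L", 10)
receive(1, "L", 30)  # a second activation starts before the first finishes
receive(1, "R", 4)   # activation 1 fires
receive(0, "R", 7)   # activation 0 fires

In a static architecture there is no tag, so only one activation per actor can be in flight at a time; the
tags are what allow the two activations above to overlap safely.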

Dataflow computers provide an efficient and elegant solution to the problems of memory latency and
synchronization overhead: they tolerate latency by switching dynamically between ready computation
threads, and they support low-overhead distributed synchronization in hardware.

For a computer engineer in practice, the architecture and behavior of every computer design we make,
whether software or hardware, is crucial to its development. Understanding the importance and purpose
of dataflow will help us build a system with a sound dataflow and architecture from scratch.

As technologies emerge and evolve so fast, I think the best next idea that can be derived from this work
is how the different dataflow program designs can be combined to create a hybrid system that has the
functionality of two or more models.
