
 

Computers & Chemical Engineering, Vol. , No. , pp. 483-491, 1983. 0098-1354/83 $3.00 + .00
Printed in Great Britain. © 1983 Pergamon Press Ltd.

COMPUTER TECHNOLOGY IN PROCESS SYSTEMS ENGINEERING

R. L. Motard

Department of Chemical Engineering, Washington University, St. Louis, Missouri 63130, USA

Abstract. This paper looks at the future of computer aided process systems
engineering. Productivity issues are discussed with respect to software,
computer design technology, engineering relational data base management
systems, and data independent programming. Very large systems integration
technology will have a major impact on the structure of process systems
engineering software. Standards are proposed for host languages and host
operating systems. Expert systems will find an increasing role.

Keywords. Software, very large scale integration, data base management systems, computer aided design, expert systems.

INTRODUCTION

In the past 15 years chemical engineers have experienced a minor revolution in the practice of process systems engineering due to the expanding role of computer technology. It is difficult to discuss the past successes of computer applications to our field when one is so aware of the tremendous promises of the future. This paper will emphasize primarily the future as an evolution from the past and present.

We have all read the projections of potential increases in packing density of computer and memory silicon chips, in anticipation of hardware cost reduction. As of 1982 we are told that memory chips may ultimately reach a density of 1-10 megabytes per chip versus 256K today. Microprocessor chips are now being developed with a half million transistors and may reach 2-10 million transistors before 1990. At some point silicon technology will reach its limit and the historical trends in computer prices will bottom out. Currently, the semiconductor industry is still projecting a 15% per year decrease in cost/performance of large-scale general purpose machines and 25% per year for small-scale general purpose machines, expressed in $/million-instructions-per-second (Branscomb 1982). The limit of silicon technology will probably be reached within the next few years. Where it bottoms out depends on the ability of silicon substrate manufacturers to produce ultrapure silicon economically. This is a chemical engineering problem in its own right. The ability to shrink electronic elements and their interconnections onto silicon is critically dependent on the density of imperfections, which in turn affects the yield of acceptable chips in mass production.

The packing density of circuit elements on a chip is only one of the factors affecting cost. There are engineering problems associated with the integration of these devices into working computers. The energy dissipated must be removed by suitable packaging. The design of very large scale integrated (VLSI) elements is so complex that progress is keyed to the development of computer aided design tools. Typically (LaBrecque 1982), a microelectronic system design begins with a system architect who produces a block diagram meeting the functional requirements. A logic designer translates the block diagram into logic and gate specifications. A circuit designer then produces circuit diagrams from these specifications. A layout designer places the circuit diagrams onto a silicon chip and a draftsman prepares the masks from which the chip will be manufactured. The process of digital system design certainly has its analog in the process systems engineering enterprise. Obviously, the cost of design must be considered. In today's economy, with state-of-the-art design technology, the design of a new chip will cost $1 million for simple devices and up to $20 million for complex microprocessor chips. The semiconductor industry today is forced to manufacture in very large volumes to recover the high cost of design and the high cost of equipment to manufacture high-density devices. New technologies such as optical electronic digital systems promise further reductions in hardware size, and biocomputers, now only a gleam in the biotechnologist's eye, may eventually reduce
 


"electronic" elements to molecular scale. However, any displacement of silicon technology will have to challenge the inertia that an entrenched, mature and ultra-refined technology acquires in the industry. All of this progress in miniaturization serves to aggravate the two main problems impeding further progress, namely software and design technology.

THE SOFTWARE PROBLEM

The development of information systems of interest to process systems engineers is critically dependent on the availability of software. While hardware cost is decreasing a million-fold between 1955 and 1985, programmer productivity will increase only by a factor of 4 (Birnbaum 1982). Another view of the software problem is offered by the observation that in 1985 the cost of leasing 1 million instructions per second of hardware capability will fall below the salary of a professional programmer for the same period of time (Branscomb 1982). Programming costs now approach 85% of total user costs for a functioning computer facility.

The key to future progress in the growth of computer use is programmer productivity. This can happen in several ways. For instance, the availability of personal microcomputers has spawned a cottage programming industry, with new software products capturing surprising market shares in unexpected ways. Two examples of the latter are VISICALC, a financial spreadsheet planning tool which has become a standard personal computer package from very modest beginnings, and CP/M, the standard operating system for Zilog's Z80 microprocessor, which has had a similar history. Further improvements in programming productivity depend on three developments (Bacon 1982): 1) software building blocks for building system software, 2) software development tools and methods, and 3) higher levels of abstraction in program-forming operations with strong algebraic properties.

Engineers will accept as fundamental the philosophy that an inventory of standard parts with suitable interface disciplines between them can be the basis for composing more complex programs. Because the parts would be rigorously verified and tested, high quality programs might be quickly assembled. This has been successful in application programming but not in systems software. There are two problems, namely determining the right set of standard blocks and providing a proper environment for their interconnection. Nevertheless, microprocessor operating systems are now being provided in read-only memory (ROM) chip sets (Lettieri 1982). Some are hardware dependent but others are processor-independent, requiring only a few external parameters such as a user-supplied configuration table to establish the interface with the application environment. They supply well-established functions which don't require modification from one system to another. Typically, a silicon-based operating system adds 20 or more high-level instructions to the basic instruction set of the microcomputer. Peripheral software routines will also allow operating system calls from higher level languages like PASCAL. Software-in-silicon is currently a marketing strategy, allowing hardware manufacturers to enter the very large value-added market in software. Nevertheless, interest will grow, quality will improve, and it is possible to speculate that the range of such products will grow to include telecommunication functions, networking functions, high-level language compilers, graphics functions and other application programs.

Software development tools and methodology are another productivity factor. These improvements in software production result in part from a decomposition of the development process and in part from the availability of management procedures. Invariably the development sequence involves a layering of languages and a collection of utility programs. For example, the portability of the UNIX operating system (Kernighan and Morgan 1982) to diverse hardware systems is greatly enhanced because UNIX was written in the C language (Kernighan and Ritchie 1978), a lower-level system development language. C is a machine-dependent language with machine dependencies carefully tailored to be adaptable to a universe of machine architectures, and restricted to about 5% of the compiler code. The UNIX part is invariant, and all one needs is a C compiler for the target machine, itself written in C. The portability of PASCAL depends on the availability of a p-code interpreter for the target machine since PASCAL is compiled to p-code. Again the PASCAL part is invariant, and the interpreter to implement p-code interpretation or translation is far easier to do quickly and correctly as a lower level machine-dependent language. Among the other tools that complete the development process we include special journalizing editors, source-language-level debuggers, documentation facilities, special linkers, and compilation and configuration databases. The management of large software projects is further enhanced by such automated tools as formal specification and design languages, text managers (filing systems), configuration managers, test data generators, flow analyzers, etc. (Howden 1982).

Program forming operations are inhibited today by our inability to construct programs in terms which fit the problem rather than at the detail required for the machine to implement the functions associated with the problem (Bacon 1982).
 


There is need for further research on the development of programming languages with strong algebraic properties relating the primitive functions of the language. With strong program-forming operations a rigorous approach might be found for building new programs from old ones. The principal barrier to these developments is the architecture of existing computers, which forces each program to be concerned with the detailed assignment and manipulation of storage. This leads to inconsistent views of data. There may be some relief in sight due to the increasing interest in relational data base management systems which offer data independent programming environments, a subject to be discussed later.

COMPUTER DESIGN TECHNOLOGY

We have already discussed the shrinking size of circuit elements on silicon chips. There are two advantages of shrinking transistors on chips (LaBrecque 1982): they switch faster and consume less energy. Reducing the linear dimension of a transistor element by 2 increases the packing density 4-fold and switching speed is doubled, a substantial increase in computing power. But the increasing complexity of very large scale integrated (VLSI) circuits poses another problem in productivity whose solution has substantial implications for process systems engineering.

As circuit elements become smaller and faster, cross-chip communication delay becomes a bottleneck. Most of the energy consumed in chips is in communication, and the wires take up 95% of the chip area as well. The basic von Neumann architecture of computers, consisting of one memory, one input channel, one output channel and one processor, makes computation a single, sequential process. This poses serious difficulties in harnessing the technology of VLSI and perhaps offers an opportunity to redefine the architecture of computing systems. Architectural issues involve two approaches to decomposition in computation. One approach is to break up the process into independent or parallel parts which can be performed concurrently in ensembles of small processors. Another approach is to use systolic arrays, the so-called data flow machines, in which the data flows rhythmically through several simple computational cells before it returns to memory. Arrays, which are either linear or two-dimensional, can achieve higher degrees of parallelism. Data may flow in an array at multiple speeds and in multiple directions. The key to parallel decomposition or array decomposition of computational tasks is to find the appropriate regularity in the application problems to be solved. Each problem would have its own special-purpose machine, with VLSI chips designed to do exactly what people want done. It is the only way to truly harness the power of VLSI and avoid the cross-chip communication dilemma.

The dawn of powerful special-purpose machines will require computer aids to assist the computer architect to divide problem-oriented computation into subunits. What better opportunity for process systems engineers to become engaged in the design of computer architectures specifically implemented to solve chemical process design problems? There are a host of questions to be addressed, all with the objective of casting process analysis, design and simulation into silicon. The use of chemical engineers in this fashion is not as far-fetched as it may seem. Carver Mead of Caltech and Lynn Conway of Xerox have designed a one-semester graduate course which teaches graduate students with backgrounds typical of computer science rather than electrical engineering how to design VLSI chips. Such backgrounds would not be foreign to engineers from disciplines other than electrical engineering. The entire process can be automated like a computer aided design project and produces chip designs only slightly less optimal than those produced by experts, in a very short time compared to the many man-months required when done by traditional methods. The entire approach is bound to be synergistic as new special-purpose chip designs are produced which assist in the design of new VLSI systems. One such development is a geometry engine containing a half million transistors which performs VLSI circuit layout. Indeed it is thought that the entire VLSI process can be automated, between a behavioral description of the computational problem and a set of circuit masks. Prototype software and hardware systems now produce chips with 20 to 30% more area than optimal designs. Once the circuit masks are available there are any number of resource centers in the US which can deliver the actual VLSI circuit in four to six weeks. The key to such design productivity is computer aided design and a well-structured design technology. The economic conditions which dictated high volume production of chips disappear when the cost of design is radically deflated. Short-run production of special purpose computers designed for process engineering tasks becomes economically feasible.

DATA MANAGEMENT

Having briefly reviewed some of the hardware issues and related problems, we have raised some questions about data organization forced on the software developer by a universal dependence on von Neumann computer architecture. There are two solutions to this problem, one involving software and the other, hardware. The first is easily perceived but the latter only dimly. Let
 


us first approach the problem of data from a software solution.

The history of computer aided design in process systems engineering has been one of proliferation of individual engineering computer programs. Communication among these stand-alone aids for the engineering process has not been solved satisfactorily. Most chemical engineers involved in process engineering are familiar with programs for process synthesis, process design, process simulation, process optimization, reactor and fractionator design, physical property estimation, heat exchanger and vessel design, piping design, inventory and project control, bill of materials take-off and automated drafting. The entire mix takes on the aspect of an unmanageable complex of activities and resources to the average project manager, who must meet ever more critical deadlines on project completion with rapidly escalating costs of execution and delays.

These problems are created by a program-centered approach to large-scale computer applications. In the past, the program and its developer occupied the core of the application process. The data were given a secondary role in the development and management of such engineering applications of computers. Most of the development effort was devoted to the program and its elegance. No great attention was paid to the value of the data in itself. Indeed most of the data ended up as a pile of computer printouts.

The solution to this complex problem is to adopt a data-centered development approach. The data are put at the core of the application process, hence the emphasis on data base management. After all, it is the data that become information, and decision-making depends on information; only secondarily on computer programs. The business community discovered long ago that their daily activities could not depend on program-centered management. The engineering community must now seriously consider the movement to data base management system (DBMS) technology to survive the complexity problem.

The first goal of data base management software is to decouple programs from data (Codd 1982). As we have said earlier, the barrier to strong program forming operations is a concern for the detailed assignment and manipulation of storage in individual programs. The decoupling of data makes the development of data independent programming conceivable. Today's problem with software is the high maintenance cost of application programs, much of it due to the close coupling of data and programs. Every small change in data structure, as each application grows in sophistication and complexity, triggers a chain reaction of programming changes to maintain the viability of existing programs. Large changes in data structures generate large reprogramming projects. Not only is maintenance reduced but data independent programming should be much more productive once the techniques are mastered.

The first step in data decoupling is to isolate the programmer from the data storage model. This is achieved by replacing positional addressing with associative addressing. The programmer need not know how the data are stored, or indeed if storage has been reorganized. His principal concerns are the associative relations among data, whether data are conceptually related on a one-to-many or many-to-many level. He retrieves or stores data via a relation name (or entity name), attribute name and attribute value (or key). Typically,

    Value ← Attribute (Entity)

such as,

    100°C ← Temperature (Stream 5)

In a truly relational data environment further data decoupling takes place between the user (logical level) and the program (conceptual level). There is thus a three-level description of data, the storage level description being hidden from the programmer, the storage and conceptual levels being hidden from the user. The user interface is a set of commands which can be used for entering data, executing application programs and generating reports.

The relational DBMS is an interface program between the applications and the data. It has its own data sublanguage permitting the insertion, deletion, retrieval and update of data along with data definition facilities. It must permit algebraic set operations without resorting to iteration or recursion. A complete relational algebra is derivable from SELECT, PROJECT and JOIN operators. Relational processing treats whole relations as operands, avoiding loops as we have said. In analyzing the operations it is convenient to think of relations as tables, with tuples (collections of attribute values) as rows and the attributes themselves as columns. Tuples are not position sensitive in the tabular concept.

The SELECT operator takes one relation as operand and produces a new relation (table) consisting of selected tuples (rows) of the first. The PROJECT operator also transforms one relation into a new one, this time consisting of selected attributes (columns) of the first. The JOIN operator takes two relations (tables) as operands and produces a third relation consisting of the rows of the first concatenated with the rows of the second, but only where speci-
 


fied columns (attributes) of the first have matching values with specified columns of the second. The JOIN operator may or may not remove redundancy in columns (attributes). The DBMS must support tables (relations), without user-visible navigation links between them. The system provides the automatic navigation.

In order to be useful to the process systems engineering program developer, the data sublanguage must be usable in two modes: (1) interactively at a terminal, and (2) embedded in an application program written in a host language such as FORTRAN or PASCAL. Thus, application programmers can separately debug at a terminal the database operations that they wish to incorporate in their application programs, then embed the same statements in the host language to complete the application program.

Beyond these simple ideas the complete DBMS must provide the following services (Codd 1982):

1. Data storage, retrieval and update

2. A user-accessible catalog of data descriptions (schemas of relations and attributes)

3. Transaction support to monitor changes in databases

4. Recovery services in case of failure at the program, system or hardware level

5. Concurrency control

6. Authorization services to ensure that access and manipulation of data is controlled at both the user and program level

7. Integration with data communication services

8. Integrity services to ensure that database states and changes of state conform to specified rules.

Obviously, such extensive services make the operation of the DBMS somewhat dependent on a host operating system. Whether one can offer such services irrespective of the operating system, thereby making the DBMS much more portable, is a question for the future.

We have stated that DBMS-based process systems engineering separates data from programs. This leads to the survivability of programs, data-independent programming, much lower maintenance costs for application software, and much higher application programming productivity. What else does it promise? Does it enhance the productivity of the chemical engineer? The answer is that it does, in several ways.

In the first place, DBMS technology eliminates the need for coding and recoding of data from one phase of process engineering to the next. It promotes efficient project management. When coupled to non-procedural command languages it provides an electronic filing cabinet and an electronic scratchpad. Process problems can be solved incrementally, bit by bit, working on one part of the project at a time without sacrificing the coherence required for the overall project. Discipline specialists can be alerted to design decision changes as they occur, without the delay inherent in bureaucratic organizations. Reports can be generated in timely fashion. There is no need to stockpile massive computer printouts, since a new report can be tailored and produced as needed. Such a system can really become a computer aid to the designer, supporting rather than inhibiting experimentation and case studies, continuously recording the data traffic in short or long computer terminal sessions, filing multiple examples, and supporting a framework for transmitting large volumes of system documentation when coupled to a text processing capability.

Data independent programs allow the construction of ever more powerful and complex process systems engineering resources. With regard to hardware solutions, relational DBMS offer great decomposition flexibility when planning a distributed network of computer systems, and great recomposition power for dynamic recombination of decentralized information. Workstation-style approaches to large scale design projects become the natural mode of execution as local computing power becomes available at the engineer's desk. Network communication and DBMS will provide the underpinning which makes the synergism between the engineer and the machine complete.

There are unresolved problems in process systems DBMS. Very large data collections lead to slow storage and retrieval. In a typical chemical process capital project, it is estimated that there are one to two gigabytes (10^9 bytes) of data per billion dollars of plant investment (Perris 1981). Such estimates have led some to speculate that a complete process engineering data base would be composed of separate data bases for each discipline (Cherry, Grogan, Knapp and Perris 1982). Some manufacturers are now offering data base machines which accelerate the searching of disc files for data retrieval. What shape future hardware will take in the face of large data managed projects is a matter of speculation.

Host Language and Operating System

Since we have alluded to host language and host operating systems in DBMS technology, it is time that we say a few words about the future of both. In our opinion, supported by our perception of widespread
 


interest in the matter, a language like PASCAL, and its offspring ADA, will become the process systems engineering computer language of the future. FORTRAN has served us well over two decades, and successive generations of the language have enhanced FORTRAN with features to be found in a more integrated environment in PASCAL. PASCAL is a well-structured language, with a simplicity which enhances its ability to detect programming errors. It encourages the creation of portable programs and modular programs. PASCAL as a language is easy to read and write; therefore programs are easy to maintain. It is a language in which restrictions have been introduced intentionally in order to reduce the number of decisions, hence the number of errors, which a programmer makes. ADA, which has its roots in PASCAL, is the result of a 5-year effort on the part of the United States Department of Defense to define a universal system development language. While there have been some reservations about various features of ADA which lack clear and precise definition or implementability, the international resources behind its development will ultimately insure that a viable version of the ADA language will survive. PASCAL and ADA incorporate the work of early research on the properties of programming languages that help eliminate common coding errors. These allow the language to enforce assertions about the range of variables and other properties. These features provide a formal approach to verifying that the programs match the intent of the designer.

We are not proposing that all old programs be discarded. Certainly, it is possible to link PASCAL and FORTRAN routines. In a DBMS environment, old FORTRAN programs with data allocation and conventional input-output statements stripped out and replaced by DBMS access statements can easily be integrated into new computing environments. All new developments, however, should be based on PASCAL/ADA.

Operating systems and DBMS are difficult to disentangle. Nevertheless, one should examine the evolving area of operating systems to identify any movements toward standardization. We have said earlier that this is important from the point of view of acquiring well-tested building blocks. It is also important for the future movement into ROM-based, software-in-silicon operating systems and programs. If there is one candidate which is emerging as the industry standard in this area it is the UNIX operating system developed in the early 1970's at Bell Laboratories. Perhaps with somewhat less conviction we anticipate that all future operating systems will be similar to UNIX, especially in the type of workstations that will be used by process systems engineers.

UNIX

UNIX (Kernighan and Morgan 1982) is fundamentally a single user operating system, although it is available on multiuser systems. Without marketing impetus, it has grown to 3000 systems world-wide excluding microcomputer sites. It is now available on most major computing machines. It is basically an electronic filing system. All files are treated merely as a stream of characters (or bytes) with no reference to hardware device characteristics such as tracks, cylinders or blocks that typify other commercial operating systems. Associated with files is a hierarchy of directories which contain information about other directories or about files, which helps to organize large collections of files. Input or output devices are handled in the same way as ordinary files, with the application program being unaware of either the source or destination of the data.

UNIX is a multitasking system, which means that a user may have several processes (or tasks) executing concurrently. Task scheduling is handled by the system kernel, which also manages data storage. In addition to a repository of utility programs available to the kernel, the system is controlled by a command interpreter called the shell. The shell accepts commands and interprets them as requests to run programs. Commands may be pipelined to connect programs. For instance, the command

    program < data | plot

will tell the system to run "program" using a data file called "data" for input and to connect the output to a program called "plot".

    program < data | lpr

would direct "program" output to a line printer. If data smoothing is required before plotting,

    program < data | spline | plot

would achieve the desired result. Provided a family of development tools is available, it is possible to write complex systems without ever using a programming language. Even the shell is a program, and the command

    sh < cmds

causes the shell to take its commands from file "cmds".

The value of UNIX as a very high level system building environment results from the decoupling of data and programs. The uniform file interface promotes this. In addition, the use of shell scripts and pipelines promotes extensive modularity. The modularity in turn promotes great
 


flexibility and evolutionary changes. UNIX can be tailored to a wide diversity of environments. Another virtue of UNIX is its small size and clean structure, which makes it very popular in university computer science departments. It is spawning a whole cottage industry of UNIX-based programs and systems. Since the source code is distributed with the system it continues to gain in popularity. It has been adapted to the new generation of 16-bit and 32-bit computer chips.

UNIX has found broad application in text processing, software development, laboratory automation, information systems involving small databases and computer science education. A commercial relational database system called INGRES is now available with a UNIX interface (Weiss 1982). UNIX is as yet not suitable for real-time systems, database systems handling large volumes of on-line transactions, or for non-programmers (although a PASCAL/UNIX interface is now available).

HARDWARE FOR PROCESS SYSTEMS ENGINEERING

From the foregoing, it is obvious that we expect the computer technology of the future to be distributed in architecture, with a great many opportunities for customizing to the application. The late 1980's will signal the dawn of the VLSI era, supported by computer aided design. Instead of merely replicating the von Neumann structures of the past in cheaper and faster circuit elements, we will see new boundaries between hardware and software. Current hardware technology is simply the lowest common denominator of machines with general purpose application. New special purpose hardware will be plug-compatible with traditional computers: software-in-silicon operating systems, intelligent terminals, microcomputers, graphics systems, printers and peripheral memory devices.

Design reports will be prepared from engineering data bases edited on text- or word-

or engineering enterprise being localized in Yokohama, London or Los Angeles will no longer be relevant. Task forces of people will be brought together through telecommunication on a global basis.

The VLSI age will provide affordable graphic processors; color raster scan devices with 1024 x 1024 pixel resolution will grace every engineering office. These will be tied to very powerful virtual microcomputers allowing the engineer to communicate with process engineering software via an electronic sketchpad. Supercomputers capable of 100 million instruction executions per second will be available at distributed service centers. Nevertheless, a great deal of the computation load will be local, using parallel processing and data flow hardware.

SOFTWARE FOR PROCESS SYSTEMS ENGINEERING

We have discussed the host language for process systems engineering (PASCAL/ADA) and the host operating system (UNIX prototype). We have made a case for data-independent programming through the use of the engineering DBMS. It now remains to anticipate the future of application software in process systems engineering.

Process design software of the future will be highly modular, data-independent, strongly structured for maintainability in the modern software engineering sense, and supported on both personal workstations and larger host machines through networking. This software will be integrated into a hardware-software complex making liberal use of building-block architecture and containing some VLSI subsystems in silicon (ROM) for such activities as operating system kernels, graphics engines, compiler engines, relational database engines, and possibly intermediate level operations of chemical process design in ROM.

As examples of the latter we propose the possibility that physical property opera-
processing machines, typeset if necessary tions, and two-and three-phase determi-
and transmitted over broadband telecommuni- nation routines would be available in ROM.
cation systems. The communication systems Another silicon engine might handle all
of the world will consist of local and scalar-vector, vector-vector, and vector-
global connections, combining voice, data, matrix operations using pipelined (systolic)
message and video modes. Local networks array processor technology. At this point
will support a great array of printing it is useful to consider whether complex
modules, distributed computing power, data process design procedures would be coded in
storage modules, etc. New and cheaper a conventional procedural language or in an
large scale data storage media such as interpretive command language like the UNIX
videodiscs will quickly assimilate whole shell language. One can at least surmise
libraries of instructional materials, re- that a command language layer will be the
ports, and design data. Access to the net- standard user interface. Graphical inter-
work will take place at home, factory, or faces will be used interchangeably with
office. Local networks will be ported to command languages. Beneath the user inter-
national and international networks via face one could have a hybrid of procedural
television cable and space satellites. routines and interpreted, non-procedural,
Not only will the computing power be dis- problem-oriented languages with the inter-
tributed but the notion of an industrial preter residing in ROM. With data-
 

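As a sketch of such a command-language layer, the fragment below uses Bourne-shell functions as stand-ins for compiled design routines; the module names (flash, mix) and their behavior are hypothetical illustrations, not part of any system described here.

```shell
# A command-language layer dispatching to interchangeable "modules".
# The shell functions stand in for compiled design codes (hypothetical).
flash() { echo "flash: feed=$1"; }
mix()   { echo "mix: $*"; }

run() {                     # tiny interpreter: one command per input line
  while read -r cmd args; do
    case "$cmd" in
      flash|mix) "$cmd" $args ;;   # dispatch to the named module
      *) echo "unknown command: $cmd" >&2 ;;
    esac
  done
}

printf 'flash S1\nmix S2 S3\n' | run
```

Because the interpreter sees only command names and argument streams, any module can be replaced or added without touching the layer above it.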
With data-independent programming, any mix of old and new process design modules can be assembled into a complex program using shell-language pipelines and scripts. Alternatively, as is the option with DBMS, the user might wish to carry on a piecemeal interaction with process models, making use of the high modularity of the environment.

The architecture of process flowsheeting systems will need to be redefined to fit the computer technology of the future. In one respect at least, flowsheeting packages will be simpler to construct. They will no longer be built with internal data storage manipulation. MIT-ASPEN, which will become the U.S. industrial standard in the near future, will be the last generation of stand-alone flowsheeting systems. The final generation will be data-based. In another respect, one will be considering new program architectures, divorced from data management, which lend themselves to different forms of partitioning to harness the power of innovative or non-von Neumann hardware structures. Partitions which support parallel processing and data flow processing will evolve.

We have traditionally been burdened with the dichotomy between equation-oriented and simultaneous modular flowsheeting systems, with a spectrum of alternatives in between. Such distinctions will begin to blur in the new technology. A hierarchy of approaches will be executable on the same process models. At one level, the user will be able to use the simplest kind of sequential models for process synthesis and screening of alternative flowsheets. At another level, the model might be used as input to a hybrid equation-solving approach, containing both equations and conventional discrete process modules. At a third level, simultaneous modular architectures might be used interchangeably on the same process model to optimize the process according to infeasible path algorithms now under development (Biegler and Hughes 1981).

Until data management is fully developed, the DBMS environment will have to accommodate hybrid environments where substantial data banks of evaluated experimental thermophysical data in traditional file and record form coexist with relational data bases. Data preprocessing will provide for a rational preparation of the process model with respect to physical data in a natural manner, generating VLE correlations, etc. The same can be said of other forms of tabular data normally extracted from handbooks and design manuals, ultimately residing on laser optical videodiscs, with anticipated recording densities of 3 gigabytes per side, or 750,000 pages of text on one side of a randomly accessible optical disc (Goldstein 1982).

EXPERT SYSTEMS

Having established a viewpoint about computer technology developments as an aid to process engineers, it now remains to stretch our conception to the limit. Where does the interface between man and machine fall in the future? We are not so concerned about ergonomics and user friendliness in this instance but about the computer system as a "knowledge" base. At what point does the software transcend the analytical or algorithmic boundary and become a synthetic problem solver, a true tool for the human mind?

Those of us who have studied process synthesis in its various parts have some experience with knowledge bases. However, our most important successes, as Allen Newell (1981) has emphasized, have been in problem areas that yield to a uniformity of representation, such as heat exchanger networks and unintegrated distillation sequences for sharp separations. Substantial progress has also been made in certain applications of computer search methods to problems in organic synthesis. Again, a uniform representation is already available in the classical structural model of organic chemical molecules. Other areas of chemical process knowledge may ultimately yield to uniform representation and thereby uniform procedures but, by and large, the most important knowledge, namely problem-specific knowledge, is non-uniform. One alternative (Stephanopoulos, Linnhoff and Sophos 1982) is to decompose the problem into uniform domains, as has recently been proposed for energy-integrated sharp distillation sequences. The best unintegrated sequences turn out to be the best candidates for integration; a happy result.

As process engineers, we see the benefits of artificial intelligence and expert systems in assisting us to invent chemical processing structures. Once a complete structure is provided by the initial structuring heuristics, we are confident of our ability to analyze the design, to optimize it, and to identify its weaker characteristics. From this knowledge one can then improve the structure, using evolutionary rules. One school of thought prefers to keep the rules simple for training purposes and for easy application using pencil and paper. However, there is ample scope for improving the simple logic of expert procedures using computer-based expert systems. In a computer environment one can provide access to a broader knowledge base of physical and chemical behavior of chemical species and their mixtures. The challenge for chemical engineers is to encode chemical process knowledge in a form that is suitable for decision making.
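To make the idea concrete, here is a deliberately simple sketch of process knowledge encoded as explicit rules that a program can apply; the thresholds and rule texts are invented for illustration and carry no engineering authority.

```shell
# Minimal sketch of a heuristic rule base for separation selection.
# Thresholds and recommendations are illustrative assumptions only.
recommend() {  # $1 = relative volatility of the key components, scaled x100
  if [ "$1" -ge 200 ]; then
    echo "rule R1: high relative volatility -> ordinary distillation"
  elif [ "$1" -le 105 ]; then
    echo "rule R2: near-unity volatility -> consider extraction or azeotropic methods"
  else
    echo "rule R3: moderate volatility -> distillation, check energy integration"
  fi
}

recommend 250
recommend 102
```

Even in this toy form, the rules are data-like and inspectable; a computer-based expert system would extend the same pattern with a far richer knowledge base and supporting computation.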

The difficulty is that in integrated networks of chemical processing units, decisions have both a local and a global impact. So, as the application of expert programming matures, we will see a growing emphasis on adaptive learning procedures drawn from the field of artificial intelligence.

Much has been made of the ability of symbolic manipulation systems in artificial intelligence for interactive problem solving. In our field I see this as only a superficial veneer, since so much of our knowledge base requires intensive computation to extract meaningful information. Every sensitive variable must be evaluated in the context of mixture-dependent properties; every reaction phenomenon is environmentally dependent. The expert systems of chemical engineering will have to be supported by substantial computation, and pure list processing languages alone will not do. Nevertheless, the evolution of relational data management systems, of VLSI engines tuned to the relational and computational algebra of chemical processing environments, and of multi-mode flowsheeting systems will raise exciting opportunities for configuring new processing complexes with computer aids.

Progress in problem solving by computer may come from an unexpected source. The proliferation of recreational software in video and computer games (Birnbaum 1982) may spill over into the computer aided design sector. These AI-based procedures already incorporate limited natural language understanding, common-sense knowledge bases, and non-trivial simulations of human reasoning.

REFERENCES

Bacon, Glenn (1982). Software. Science, 215, 775-779.

Biegler, L. T. and R. R. Hughes (1981). Infeasible Path Optimization with Sequential Modular Simulators. American Institute of Chemical Engineers Meeting, New Orleans, Louisiana, November, Paper 50a.

Birnbaum, Joel S. (1982). Computers: A Survey of Trends and Limitations. Science, 215, 760-765.

Branscomb, Lewis M. (1982). Electronics and Computers: An Overview. Science, 215, 755-760.

Cherry, D. H., J. C. Grogan, G. L. Knapp, and F. A. Perris (1982). Use of Data Bases in Engineering Design. Chem. Eng. Progr., 78, 59-67.

Codd, E. F. (1982). Relational Database: A Practical Foundation for Productivity. Comm. ACM, 25, 109-117.

Goldstein, Charles M. (1982). Optical Disk Technology and Information. Science, 215, 862-868.

Howden, William E. (1982). Contemporary Software Development Environments. Comm. ACM, 25, 318-329.

Kernighan, Brian W. and Dennis M. Ritchie (1978). The C Programming Language. Prentice-Hall, Englewood Cliffs, New Jersey.

Kernighan, Brian W. and Samuel P. Morgan (1982). The UNIX Operating System: A Model for Software Design. Science, 215, 779-783.

LaBrecque, Mort (1982). Faster Switches, Smaller Wires, Larger Chips. MOSAIC, Jan/Feb, National Science Foundation, Washington, D.C., pp. 26-32.

Lettieri, Larry (1982). Software-in-Silicon Boosts System Performance, Cuts Programming Time. Mini-Micro Systems, March, 93-95.

Newell, Allen (1981). How to View the Computer. In R. S. H. Mah and W. D. Seider (Eds.), Foundations of Computer Aided Process Design, Vol. 1, American Institute of Chemical Engineers, New York, pp. 1-25.

Perris, F. A. (1981). Imperial Chemical Industries, Ltd. Personal communication.

Stephanopoulos, George, B. Linnhoff and A. Sophos (1982). Synthesis of Heat Integrated Distillation Sequences. Understanding Process Integration, The Institution of Chemical Engineers Symp. Ser. No. 74, London, pp. 111-130.

Weiss, Harvey M. (1982). INGRES: A Data-Management System for Minis. Mini-Micro Systems, January, 231-237.
