SYSTEMS ANALYSIS
Enrico Sciubba
University of Roma I, Italy
Keywords: Optimization of energy systems, Design optimization, Synthesis optimization,
Artificial intelligence, Process Synthesis, Inverse Design, Expert Systems, Intelligent Health
Control.
Contents
1. Introduction
2. Is there a "universal" design paradigm?
3. Application of the Universal Design Procedure to Process Synthesis
4. "Design" and "Optimization"
5. Process Optimization
6. Computer-aided Synthesis and Design tools
7. Application of the Universal Design Procedure to the Design of Components
8. Expert Assistants for Process Diagnostics and Prognostics
9. Conclusions
Related Chapters
Glossary
Bibliography
Biographical Sketch
Summary
This chapter is an introduction to the field of artificial intelligence (AI) applications to the
design and monitoring of energy systems, and serves as a compendium for the related
chapters that follow under this topic. After a brief discussion of the characteristics that make
AI useful for engineering applications, a concise definition of terms and concepts is given.
The presentation style has been tailored to provide readers with a general introduction to AI
topics, without burdening them with excessive formalism. Since our goal is to describe
engineering applications to thermal design, the applicative side has been emphasized.
1. Introduction
This chapter describes in detail the general activities connected with the application of a
powerful set of the so-called AI-procedures to the selection, synthesis, design and control of
energy systems. Since the field is very broad, we shall restrict our treatment to a sub-set of
AI, called Expert Systems (ES); other tools, such as Neural Networks (NN) and Fuzzy Logic
(FL), are treated only marginally. The general principle is to
implement a computer-assisted procedure that possesses, in a form that will be discussed in
detail for each implementation, some of the "intelligence" of the human designer. The process
is in principle quite simple, and it is based on the premise that for each "design" task (the type
and structure of these tasks shall also be discussed in detail) there exists a set of general
guidelines, derived from engineering experience formalized and catalogued in the form of
either design manuals or textbooks or otherwise published and accepted design procedures.
An ES is therefore a computer code that mimics not so much human reasoning itself as the
way this reasoning can be (and has been) organized at the present stage of technology. The
first point to be argued is that there is indeed one general design protocol for all types of
design problems: this claim is crucial to our thesis, and is in fact the justification for the search
for AI-based "Design Assistants". Once the existence of such a protocol has been established, it is
a simple matter to show that the two fundamental design tasks encountered by an engineer,
namely the direct (simulation) and inverse (design) problem, can be considered embedded
into a single meta-procedure. The individual chapters under this Topic discuss the application
of this meta-procedure to the synthesis of a process, to the design and/or choice of
components, and to the development of intelligent monitoring and control systems.
Related Chapters
Glossary
Abstraction : Every data set can be represented by two levels of specification: a
functional one, which describes what the objects do (their function), and a
practical one, which describes the actual implementation of that function
(the many elemental tasks that the object performs to accomplish its
function). The logical action of separating the functional from the
practical specification level, and of taking only the former to represent the
object, is called abstraction.
Aggregation level : The logical level at which components or units of a process are
described. At aggregation level 0 each single component is treated as a
black box; 1 is the level at which functional groupings are considered, and
so on. There is always a maximum aggregation level at which the entire
process is treated as a single black box.
Algorithm : A finite set of clear and unambiguous elementary instructions that
accomplish a particular task.
Always-relevant parameters : In a design task, all those parameters that are known to be, or
are thought to be, important in the identification of the properties or the
operational behavior of the system to be designed.
Analogy : A particular type of logical link between two objects whose attributes
may not be the same or be in the same "order", but which can be mapped
onto each other at some higher level: this means that there exists at least
one set of rules capable of representing the relationships among
corresponding relevant attributes of both objects.
AND-tree : A decision tree that possesses only AND nodes.
AND/OR tree : A decision tree that includes at least one AND/OR node.
Approximate reasoning : Logical deduction based on approximate knowledge, that is, on a
knowledge base (KB) expressed by non-exact facts. An approximate KB
ought not to be confused with a fuzzy KB: a characteristic of approximate
statements is that they can always somehow (albeit at times in an awkward
format) be expanded to give origin to "exact" expressions.
Artificial Intelligence : The portion of Computer Science that investigates symbolic and
non-algorithmic reasoning, and attempts to represent knowledge in such a form
that it can be employed to generate instances of machine inference.
Attribute : A property of an object. Said to be characteristic if it is the attribute that
establishes whether the object belongs to a class.
Automatic operation of a plant : The implementation of a series of monitoring and control
devices connected to a central unit that coordinates the flux of information
from and to the processes, and manages them according to either a
predetermined plan ("non-intelligent" or rigid control) or to a general
schedule that attempts to meet some general goals ("intelligent" or flexible
control).
Backward chaining (BC) : A form of reasoning that attempts to validate (or confirm) a fact
(called goal) by scanning the knowledge base (KB) to determine whether the
rules it contains univocally generate the logical possibility of the goal. In
practice, this means that if a fact q is asserted by the KB, every rule that
contains q as the THEN predicate is assumed to be true, and therefore p,
the predicate of its IF, is also true.
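The backward-chaining loop just described can be sketched in a few lines of code. The following Python fragment is an illustrative sketch only: the rule set, the fact names, and the `prove` helper are invented for this example and are not taken from any particular expert system.

```python
# Backward chaining sketch: to confirm a goal, look for rules whose THEN-part
# is the goal and try to prove each of their IF-predicates in turn.
# Rules and facts below are invented for illustration.
RULES = [
    ({"humid", "hot"}, "uncomfortable"),  # IF humid AND hot THEN uncomfortable
    ({"rainy"}, "humid"),                 # IF rainy THEN humid
]
FACTS = {"rainy", "hot"}

def prove(goal, facts, rules):
    """True if `goal` is asserted, or if some rule concludes it and all of
    that rule's premises can themselves be proved (assumes acyclic rules)."""
    if goal in facts:
        return True
    return any(conclusion == goal and all(prove(p, facts, rules) for p in premises)
               for premises, conclusion in rules)

print(prove("uncomfortable", FACTS, RULES))  # True: "hot" is asserted, "humid" follows from "rainy"
```

Note that the search proceeds from the goal back toward the asserted facts, the opposite direction of forward chaining.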
Backtracking : When a scanning procedure reaches a "dead branch" (a node Nij which
does not possess suitable successors) in the tree it is exploring, it must
resume the search from ("backtrack to") some previous node Nhk. If h = i -
1, k = j (that is, Nhk is Nij's predecessor), the backtracking is called
chronological; otherwise, it is procedure-driven (for instance, breadth
search resumes from Ni,j-1, depth search from Ni-1,j, etc.). Backtracking can
also be directed by a logical-connection list, and in this case it is called
dependency-directed.
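Chronological backtracking can be illustrated with a short depth-first scan; the tree, the node names, and the `depth_first` helper below are hypothetical, chosen only to show where the search "backtracks to" a node's predecessor.

```python
# Depth-first scan with chronological backtracking: when a node has no
# suitable successors (a "dead branch"), the search resumes from its
# predecessor. Tree and goal test are invented for illustration.
def depth_first(tree, node, is_goal, path=()):
    path = path + (node,)
    if is_goal(node):
        return path
    for child in tree.get(node, []):       # empty list -> dead branch
        found = depth_first(tree, child, is_goal, path)
        if found is not None:
            return found
    return None                            # returning None to the caller is
                                           # the chronological backtrack step

tree = {"N00": ["N10", "N11"], "N10": ["N20"], "N11": ["N21", "N22"]}
print(depth_first(tree, "N00", lambda n: n == "N21"))
# ('N00', 'N11', 'N21') -- N20 was a dead branch, so the scan backtracked
```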
Belief : A particular form of approximate knowledge, in which a fact is asserted
not with absolute or quantifiable certainty, but with a vague and non-
quantifiable proposition: "I believe that f is true." Belief can be quantified
by assigning a computable degree of unlikeness to any counter-fact that
would disprove the believed fact.
Blackboard system : A particular form of hierarchically organized expert system (ES),
consisting of a certain number of local ES ("sources"), each handling a
portion of the knowledge base and communicating their conclusions to a
"blackboard," which represents the "latest state of affairs" of the global
inferential activity. A dedicated inference engine controls the flow of
information and resolves contradictions between different sources.
Certainty : An event is said to be certain when the probability of its negation is
unconditionally equal to zero. The majority of engineering data are not
absolutely certain: this can be accounted for by attaching to the
corresponding propositions a certainty factor 0 ≤ fc ≤ 1, and applying the
rules of approximate reasoning.
Class : A collection of objects, based on some common attribute of its
components, or on some peculiar relationship between them. Notice that
there may be more characteristic attributes and multiple characteristic
relationships.
Clause : The predicate of a logical connective (IF and THEN) in a rule.
Connection matrix (CM) : A process-dependent structured data table containing information
about the connectivity of a process. The CM is in fact a matrix
representation of the connectivity graph of the process.
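As a sketch of the idea, the connection matrix of a small steam cycle can be built directly from its connectivity graph. The component names and links below are invented, and CM[i][j] = 1 is taken to mean that a flux leaves component i and enters component j.

```python
# Build a connection matrix (CM) from a directed connectivity graph.
# The components and links are a made-up example, not a real process.
components = ["boiler", "turbine", "condenser", "pump"]
links = [("boiler", "turbine"), ("turbine", "condenser"),
         ("condenser", "pump"), ("pump", "boiler")]

idx = {name: i for i, name in enumerate(components)}
n = len(components)
CM = [[0] * n for _ in range(n)]
for src, dst in links:
    CM[idx[src]][idx[dst]] = 1           # flux from src to dst

print(CM[idx["boiler"]][idx["turbine"]])  # 1: the boiler feeds the turbine
print(CM[idx["turbine"]][idx["boiler"]])  # 0: no flux in the other direction
```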
Connectivity graph : A graph consisting of points (nodes), each representing an object (or
event), connected by lines that represent the interconnections between
pairs of nodes. These interactions may be logical or physical, and may
possess an implicit "direction" (that is, be equivalent to an arrow) or be
direction-indifferent. Sometimes, each connection bears a "cost label,"
which identifies the price (monetary, energetic, or functional) that one
incurs if that route is taken.
Constraint : Any restriction posed by technological, environmental, legal, or practical
requirements on the configuration of a system. Constraints may limit the
values of some process parameter, or exclude some system configuration.
Criteria for success : A set of attributes of the object to be designed that can be assigned
quantitative values, so that the probability that a particular design will be
successful can be assessed.
Data : A collection of facts, relationships, and propositions about a portion of
the Universe.
Data type : A collection of objects and operations to be performed on these objects.
Decision tree : A graphical representation of the set of possible choices for a given
situation. The "tree" can be thought of as a single trunk giving origin to
several branches, each branch having twigs, each twig bearing twiglets, ...
until the leaves (or fruits) are reached.
Declarative knowledge : Knowledge of facts and relationships expressed in propositional
form: "fact A is an object B having the relationship p with the object C".
Also called knowing what.
Deduction : A logic procedure that marches forward from the causes to their effects.
Deep knowledge : Substantial knowledge of a problem (which includes a complete
understanding of the premises, a "feel" for the outcome, and a vast
experience of application cases) that implies a qualitative and quantitative
comprehension of the logical chain of reasoning and of the underlying
physical phenomena. An individual possessing deep knowledge about a
field is called an expert in that field.
Degree of membership : The degree to which a value belongs to a fuzzy set. For example, a
level measurement may be 40% positive large and 20% positive medium.
Deterministic programming : Numerical methods stemming from the quantitative modeling
of a portion of the Universe, and the discretization of the model equations.
They usually assume that at least one solution exists for the problem under
consideration, and generate solutions that are strongly model-dependent.
Direct design problem : A problem that requires the engineer to find the "best" solution
(usually, highest possible efficiency and/or minimum production cost) to
attain a specified design objective (the required output) for a given process
configuration.
Domain expert : An expert in a specific field related to the domain of an expert system
(ES), whose knowledge must be imparted to the ES.
Embedding : "Containing", in a logical sense. A procedure p is said to embed another
procedure q if every time p is performed, q is too, but not vice-versa.
Notice that p is necessarily at a higher logical level than q.
Encapsulation : Concealing the details of the implementation of an object. For example,
a black box approach encapsulates the actual internal structure of the
component it represents, and projects to the outside user only the function
performed by that component.
Facts : Expressions that describe particular situations. The expressions can be
logical (propositions), symbolic ("p = q"), numerical, or probabilistic.
Feasibility study : One of the subtasks of any design activity. It consists of creating a
preliminary concept configuration for the process, approximately sizing
the main equipment, and executing a preliminary technological and cost
analysis.
Feedback : In the context of this Theme, this word has two meanings. A control
system is said to possess a feedback mechanism when its output (that is,
the controlling action) depends on the value assumed by the controlled
quantity downstream of the control. A process is said to have feedback
fluxes if, seen from the point of view of its main product, the process
diagram is not linear, because certain components receive inflows from
downstream (the P&I displays some "feedback loops"). See also "logical
loop."
Forward chaining (FC) : A form of reasoning that attempts to deduce a hitherto "unknown"
fact (called goal) by scanning the knowledge base (KB) to determine whether
the data it contains may generate the logical possibility of the goal. In
practice, if a fact p asserted by the KB is the "IF"-predicate of a rule, then
we can infer that q, the "THEN"-predicate of that rule, is also true.
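A minimal forward-chaining loop can be sketched as follows. The rules and facts are invented for illustration; the code simply keeps firing rules whose IF-part is satisfied until no new fact appears.

```python
# Forward chaining sketch: repeatedly fire every rule whose premises are all
# asserted, adding its conclusion, until a fixed point is reached.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)     # the rule "fires": q is now asserted
                changed = True
    return facts

RULES = [({"rainy"}, "humid"), ({"humid", "hot"}, "uncomfortable")]
derived = forward_chain({"rainy", "hot"}, RULES)
print(sorted(derived))  # ['hot', 'humid', 'rainy', 'uncomfortable']
```

The loop marches from the data toward the conclusions, whereas backward chaining starts from the goal.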
Frame : A collection of semantic network nodes and links. A frame is an object
denoted by its name, endowed with some slots, each slot having more
facets: slots and facets may store values, attributes, or relations.
Fuzzy knowledge : A synonym for vague knowledge. Vagueness may be introduced into a
problem by incomplete or incorrect data, incorrectly expressed
relationships, or the use of an inappropriate model.
Fuzzy search : A tree-scanning procedure in which the searching criterion is "fuzzified."
Fuzzification : The process of decomposing a correlated set of knowledge chunks
(usually available in an uncertain or vague fashion) into one or more
qualitative groups called fuzzy sets.
Fuzzy set : A set admitting a non-binary degree of membership. In normal ("crisp")
set theory, an object either is or is not a member of a certain set. In fuzzy
set theory (FST), a degree of membership 0 < dm < 1 can be attached to
each object to express the likelihood of its belonging to a given set: dm is
clearly a particular kind of relationship between a set and its objects.
"Normal" language expressions ("very likely," "highly improbable,"
"almost certain," "more likely than") can be easily quantified by a fuzzy
approach. Therefore, FST is well suited to handling vague knowledge.
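A common way to quantify the degree of membership dm is a triangular membership function. The breakpoints below are invented so that, for instance, a level reading of 70 comes out 40% "positive large" and 20% "positive medium".

```python
# Triangular membership function: dm rises linearly from 0 at a to 1 at b,
# then falls back to 0 at c. All breakpoints are illustrative only.
def triangular(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

level = 70.0
dm_large = triangular(level, 50.0, 100.0, 150.0)   # 0.4 -> 40% "positive large"
dm_medium = triangular(level, 25.0, 50.0, 75.0)    # 0.2 -> 20% "positive medium"
print(dm_large, dm_medium)
```

One crisp input value may thus belong, to different degrees, to several fuzzy sets at once.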
Heuristics : A set of rules, often approximate and vague, dictated by intuition,
experience, and judgment. Can also be seen as a consistent but incomplete
set of propositions about a chunk of specific knowledge, for which an
exact description is either unavailable or impossible. Proper application of
heuristics, especially in the early stages of a search process, drastically
limits the solution space.
Hierarchical refinement (HR) : Top-down procedure by which a design plan is implemented
in descending levels of abstraction: first the general plan layout, then a
detailed preliminary plan, then a general activity schedule, then a detailed
activity schedule, and so on. HR is, though, a more general concept that
has useful applications in several fields, for example in process synthesis.
Ill-posed : A problem that is not well posed. Also called Ill-structured.
Induction : A logic procedure that marches backwards from the effects to their
causes.
Inference : The derivation of conclusions from premises. Logically expressed in
propositional calculus by "IF [(a AND b) OR c] THEN d" rules.
Inference Engine (IE) : A consistent and ordered set of rules embodied in a shell-like
program that establishes, controls, and instantiates the problem-solving
strategy of an expert system.
Instance of a class : A specific object of the class, identified by a quantitative or qualitative
value assigned to one of its characteristic attributes.
Inverse design problem : A problem that requires the engineer to design (i.e., to define and
compute the physical configuration of) a device or a plant, given the design
objective (the required output) and an objective function (usually,
efficiency or unit production cost).
Knowledge : A qualitative and/or quantitative description of a specific portion of the
Universe, in the form of a collection of data (i.e., of certain sets of facts,
relationships, rules, forecasts, estimates, and numerical expressions).
Knowledge acquisition : The collection and ordering of knowledge. Requires some
knowledge at a higher level than the problem being considered (meta-knowledge).
Knowledge base (KB) : All of the knowledge collected by the knowledge acquisition activity.
Knowledge decomposition (KD) : The subdivision of the knowledge base into elemental
information bits, that is, into the largest possible set of the smallest
possible data. KD forms the conceptual basis of semantic networks.
Knowledge engineer : A person dedicated to the eliciting, gathering, and collecting of
knowledge (typically from domain experts), and to its organization into a
knowledge base.
Knowledge representation : Organization of the "raw" knowledge base ("as collected"
knowledge) into a coherent, comprehensive, and workable knowledge base.
Logical loop : A chain of formal propositions p → q → r → ... → z whose last term z is
the premise for the first one p. To avoid being a tautology, a logical loop
must be implemented on instances of propositions, that is, on facts. In this
case, it means that there is a feedback mechanism, driven by the output of
the last fact (component, flux) z, which influences the input of the first
fact (component, flux) p.
Logical system : A system consisting of symbols ("signs," "words") and of a complete and
coherent set of rules ("grammar," "syntax," "semantic") that describe the
correct use of the symbols.
Memory : The ability of an expert system to "remember" facts not originally
contained in its knowledge base, that were constructed, deduced, or
induced in the course of the inference process.
Macro structuring : The process of collecting, under a single expert system, solution
procedures for different problems. Usually, the macrostructure consists of
a macro-IE (inference engine) that drives several specific IEs, one for each
problem (as in blackboard systems, for example).
Membership function : The curve attached to a fuzzy set, which maps an input (or output)
value onto a corresponding value for the degree of membership.
Meta structuring : The process of devising a single higher-level procedure capable of
solving distinct, but logically similar, problems. The meta-procedure is
likely to be simpler than each of its instances.
Modular simulator : A process simulator that is not dependent on the structure of the
process, but can be applied to whatever structure one can build using the
(modular) components and fluxes contained in its library.
Neural network : A set of nodes (neurons) arranged in layers (conventionally, from left to
right), where each neuron of each layer is connected to all of the neurons
of the previous ("upper") and following ("lower") layer, but not to the
neurons of its own layer. Neurons communicate messages to each other,
and the firing of a message depends on the state of the neuron at the
particular time of firing. If the leftmost level experiences some
environmental change (it is given some input), the information propagates
until it reaches the rightmost level: the modification of the states of this last
layer, taken globally, represents the answer of the neural network to the
initial stimulus.
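The left-to-right propagation described above can be sketched as a forward pass through a tiny fully-connected network. The layer sizes, weight values, and sigmoid activation are arbitrary choices made for illustration, not a prescription.

```python
import math

# Forward pass: each neuron of a layer combines all outputs of the previous
# layer (weighted sum plus bias) and "fires" through a sigmoid.
# All weights and biases below are made-up numbers.
def forward(x, layers):
    for W, b in layers:                      # propagate layer by layer
        x = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + bi)))
             for row, bi in zip(W, b)]
    return x

layers = [
    ([[0.5, -0.2], [0.3, 0.8]], [0.0, 0.1]),  # 2 inputs -> 2 hidden neurons
    ([[1.0, -1.0]], [0.0]),                   # 2 hidden -> 1 output neuron
]
out = forward([1.0, 0.5], layers)             # the network's "answer" to the stimulus
print(out)                                    # a single value in (0, 1)
```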
Objective function : The formalization (not necessarily in numerical form) of the criteria
for optimality of a solution.
Object : A computational structure that contains data, data structures, and related
procedures. This means that an object can operate on itself, because it
contains both the data and the operational information about how to
process them.
Optimization : The mathematical procedure that leads to the quantitative minimization
or maximization of a certain expression of the state parameters of the
process under consideration, called the procedure's objective function. The
set containing the values of the state parameters that extremize the
objective function is called the optimal solution set. In qualitative terms,
optimization is the construction of the "best possible" solution to a given
problem.
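In the simplest quantitative setting, optimization amounts to scanning the state parameters for the values that extremize the objective function. The toy cost function and the coarse grid scan below are invented purely to illustrate the definition.

```python
# Toy optimization: minimize an invented objective function of two state
# parameters p and q by a coarse grid scan over [-5, 5] x [-5, 5].
def cost(p, q):
    return (p - 2.0) ** 2 + (q + 1.0) ** 2 + 5.0

best = min((cost(p / 10, q / 10), p / 10, q / 10)
           for p in range(-50, 51) for q in range(-50, 51))
print(best)  # (5.0, 2.0, -1.0): the optimal solution set is p = 2, q = -1
```

Real energy-system problems replace the grid scan with far more efficient deterministic or heuristic search methods.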
OR-tree : A decision tree that possesses only OR nodes.
Paradigm : An underlying principle of organization that constitutes the conceptual
basis for a programming language or a solution procedure.
Possibly-relevant parameters : State parameters of a process that are not essential to its
description, but may become essential as a consequence of external
circumstances (imposition of constraints, activation of a correlation with
other relevant parameters, and so on).
Predicate calculus : A pseudo-language consisting of logical predicates (facts describing
objects and their relationships) joined by a small and fixed set of
connectives (AND, OR, IF...THEN, EQUAL TO, LARGER THAN, FOR
ALL...THEN) that establish mutual correlations between predicates. The
only form discussed in this article is the first order predicate calculus, in
which objects may assume quantitative values, but predicates cannot.
Primitive problem : The form in which an engineering problem is first formulated. It is
usually ill-structured (that is, vague, incomplete, with a non-precisely
formulated objective function, and with unpredictable knowledge of its
solution).
Probability of success : Qualitative or quantitative likelihood that a product is produced
within the schedule and budget, meets the objectives, and corresponds to
the criteria for success preset for it.
Procedural knowledge : Knowledge of the inference procedure to apply to a certain data set
to reach certain conclusions. Also called knowing how.
Procedure : A set of subtasks with a predetermined interconnection, whose correct
execution leads to the fulfillment of a task or project.
Process : A series of transformations that modify the physical state of some
specified amount of matter.
Process simulator : A software tool that deterministically computes the
instantaneous state of a system on the basis of the local properties of the
working media and of the operating conditions of its components.
Propositional calculus : A pseudo-language consisting of logical propositions (assertions
about objects and their relationships) joined by the four fundamental
connectives (AND, OR, IF...THEN, EQUAL TO), which establish mutual
correlations between these propositions. Propositional calculus is not
quantifiable: its goal is to derive logically necessary conclusions from a
set of premises.
Property : An attribute of an object.
Prototype : A working instance of a computational code, usually less complex than
the finished product, but having all of its essential features (specifically
the inference engine). Generally, the prototype version has the same
problem-solving depth as the final version, but much less breadth (that is,
it can handle a much smaller solution space).
Qualitative reasoning : A form of reasoning used in process modeling. It is based on
non-quantitative concepts, like analogy, similarity, asymptotical behavior,
interactions, order-of-magnitude values, relationships, structure, and
functionality.
Recursive algorithm : An algorithm A that calls on itself during its own execution.
Recursion can be direct, if the link is of the type "call (A)", or indirect, if
the link is of the type "call (B)", and B contains a "call (A)" statement.
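Both forms of recursion can be shown in a few lines; `factorial`, `is_even`, and `is_odd` are standard textbook examples, not taken from the chapter.

```python
# Direct recursion: factorial contains a "call (factorial)" in its own body.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

# Indirect recursion: is_even calls is_odd, and is_odd calls is_even back.
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)

print(factorial(5))  # 120
print(is_even(10))   # True
```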
Relational programming : Non-numerical methods resulting from the qualitative modeling
of a portion of the Universe, and the conceptual representation of the
context in which the problem arises. They do not assume a priori that a
solution exists, and generate models for the solutions rather than the
solutions themselves.
Relationships : Connections between objects.
Rules : Logical constructs of the form "IF p THEN q", where both p and q are
propositions or instances of propositions.
Search methods : Numerical or qualitative techniques that scan a certain number of
alternatives to find the most desirable one. The search can be performed
on layered structures (decision trees), in which case the search criterion
may be local ("among the successors of a node, proceed to the one with
the most convenient value of a certain state parameter") or global ("choose
the path composed by the nodes and branches such that the overall value
of a certain function of both a state and a path parameter is the most
convenient"). Global searches are obviously more difficult, and special
methods have been devised to limit the number of nodes explored by the
scanning procedure.
Semantic network : A set of nodes and their connections: each node represents an object,
and each connection expresses a formal relationship between objects,
chosen from a list of "allowable" relationships.
Shells : A domain-independent framework of rules connected to form a very
general inference engine with specific problem solving capabilities. A
shell is used by implementing its "reasoning" on a set of specific (i.e.,
domain- and problem-oriented) knowledge added modularly to the
knowledge base of the shell.
Symbolic reasoning (SR) : The application of manipulation rules (syntax) to a set of
pre-defined symbols. SR follows the rules of predicate calculus.
Taxonomy : A systematic set of rules used for classification purposes. Given a
database composed of various chunks of knowledge and raw data,
knowing its taxonomy means, for example, being able to partition the
database into distinct but correlated subsets, in each of which a different
inference engine may be applied.
Thermal System : A collection of orderly interconnected devices/components whose goal is
that of performing an energy conversion, usually from chemical and/or
thermal into mechanical and/or electrical and/or thermal.
Thermo-Economics : A second-law based cost optimization technique. The costs (in
monetary units) are computed with the aid of entropic and exergetic considerations.
Weighted search : A search process in which each branch of the decision tree carries a
"weight" or "penalty" function: the objective of the search is to find the
path(s) for which the sum of the weights on the n branches constituting the
path(s) is minimal.
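A minimal weighted search can be written with a priority queue (essentially Dijkstra's algorithm). The weighted tree below is invented, and `cheapest_path` is a hypothetical helper name.

```python
import heapq

# Weighted search: find the path whose sum of branch weights is minimal.
# graph maps a node to its (successor, weight) branches; values are made up.
def cheapest_path(graph, start, goal):
    queue = [(0, start, [start])]          # (accumulated weight, node, path)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path              # first pop of the goal is optimal
        if node in seen:
            continue
        seen.add(node)
        for succ, w in graph.get(node, []):
            heapq.heappush(queue, (cost + w, succ, path + [succ]))
    return None

tree = {"root": [("A", 3), ("B", 1)], "A": [("leaf", 1)], "B": [("leaf", 4)]}
print(cheapest_path(tree, "root", "leaf"))  # (4, ['root', 'A', 'leaf'])
```

Note that the locally cheaper branch (root to B, weight 1) does not lie on the globally cheapest path, which is exactly why global searches are harder than local ones.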
Well-posed : In a strict mathematical sense, a problem is said to be well-posed if it has
one unique solution, and this solution depends continuously on the
relevant data of the problem (boundary and initial conditions, values of the
coefficients, and so on).
Well-structured : A problem that can be described in terms of numerical variables,
possesses a univocally and well-defined objective function, and admits of
an algorithmic routine for its solution.
Bibliography
A.Bejan, G.Tsatsaronis, M.J.Moran (1996): Thermal Design and Optimization, J.Wiley & Sons [A fundamental
work, complete and clearly written, covering the engineering side of the design of Thermal Systems]
M.De Marco et al. (1993): COLOMBO: an expert system for process design of thermal powerplants, ASME-AES
vol.1/10 [The first completely documented Intelligent Process Synthesizer]
R.A.Gaggioli, S. Qian, D.A. Sama (1989): A Common-Sense Second Law Approach for Improving Process
Efficiencies, Proc. TAIES 1989, Beijing, International Academic Publishers-Pergamon Press [An invaluable
source of "procedural thinking", useful in the development of Expert procedures]
M.Green (Ed.) (1992): Knowledge Aided Design, Academic Press, 257 p. [A collection of specific topics in
direct AI applications to design. For specialists]
T.Gundersen, L.Næss (1988): The synthesis of cost optimal Heat Exchanger Networks: an industrial review of the
state of the art, Comp.Chem.Eng. vol.12, no.6 [A much valued reference: though outdated, it is a very useful
review paper on HEN Design techniques]
E.C.Hohmann,F.J.Lockhart (1976): Optimum Heat Exchanger Network synthesis, Proc. AIChE Meet., Atlantic
City [The original work (after Hohmann's Doctoral Thesis) that laid the foundations of modern HEN Design]
A.S.Kott, J.H. May, C.C. Hwang (1989): An autonomous Artificial Designer for thermal energy systems, ASME
Trans., Oct. 1989 [One of the first Process Synthesizers]
B.Linnhoff (1997): Introduction to Pinch Analysis, in "Developments in the Design of Thermal Systems",
R.Boehm (Ed.), Cambridge U.Press [A mature description of Pinch and Supertargeting methods]
M.Maiorano, E.Sciubba (2000): HENEA: an exergy-based Expert System for the synthesis and optimization of heat
exchanger networks, Int.J.Appl. Thermod., v.3, n.2 [The first ES-based HEN Design procedure]
N.Metropolis et al. (1953): Equation of state calculations by fast computing machines, Jnl. of Chemical Physics,
vol. 21 [The original description of the Simulated Annealing Method]
M.J.Moran (1997): Second Law applications in Thermal System Design, in "Developments in the Design of
Thermal Systems", R.Boehm (Ed.), Cambridge U.Press [Another valuable source of "expert human thinking" in
the Design of Thermal Systems]
A.Newell (1990): Unified theories of cognition, Harvard Univ. Press [An epistemological explanation of the
basis of AI techniques, including (but not limited to) Expert Systems]
L.T.Ngaw (1998): HEN synthesis using HENCALC and Second Law insight, M.S. Thesis, U. Mass. at Lowell
[An extraordinary example of the extremely sophisticated level at which exergy thinking can be used in HEN
Design]
B.Paoletti, E.Sciubba (1997): Artificial Intelligence Applications in the Design of Thermal Systems, in
"Developments in the Design of Thermal Systems", R.Boehm (Ed.), Cambridge U.Press [One of the first
systematic analyses of the field of AI applications to Thermal Design]
G.V.Reklaitis, A. Ravindran, K.M. Ragsdell (1983): Engineering Optimization, J.Wiley [A sort of
handbook on Engineering Optimization, with many examples of applications]
D.A.Sama (1995): The Use of the Second Law of Thermodynamics in Process Design, JERT, vol. 117, no.9
[Another valuable source of "expert thinking" for Process Engineers]
E.Sciubba (1998): Toward Automatic Process Simulators, Part II: An Expert System for Process Synthesis, J.
Eng. for GT and Power, vol.120, no.1 [Contains a systematic discussion of a second-generation Expert Process
Synthesizer]
R.D.Sriram (1997): Intelligent Systems for Engineering, Springer Verlag [A fundamental reference for AI
practitioners. Many techniques explained in detail, many worked-out examples]
W.F.Stoecker (1980): Design of Thermal Systems, McGraw Hill [An old, but still valid textbook. To be used as a
source of "expert advice" for the Design of Thermal Systems]
Biographical Sketch
Enrico Sciubba (born July 11, 1949) is a Professor in the Department of Mechanical and Aeronautical
Engineering of the University of Roma 1 "La Sapienza", in Roma, Italy. He received his M.Eng. degree in
Mechanical Engineering from the University of Roma in 1972. After working for two years (1973-75) as a Research
Engineer in the Research & Development Division of BMW, Munich (Germany), he returned to the University
of Roma as a Senior Researcher (1975-1978). He then enrolled in the Graduate School of Mechanical
Engineering, majoring in Thermal and Fluid Sciences, at Rutgers University, Piscataway, NJ, USA, where he
was granted a Ph.D. degree in 1981. He joined the Department of Mechanical Engineering of the Catholic
University of America, in Washington DC, USA, as an Assistant Professor in 1981, and worked there until 1986,
when he returned to the University of Roma 1 first as a Lecturer, then as an Associate and finally Full Professor.
He holds the Chair of Turbomachinery, and lectures on Energy Systems as well, both at the undergraduate and
graduate level. In 1999 Dr. Sciubba was elected a Fellow of the American Society of Mechanical Engineers. In
2000, he received an Honorary Doctoral title from the University Dunarea de Jos of Galati (Romania). His
research is related to CFD of Turbomachinery, to Exergy Analysis, and to Artificial Intelligence applications in
the design of Energy Systems. His publications include more than 40 archival papers, over 150 articles in
international conferences, one book on Turbomachinery (in Italian) and one on Artificial Intelligence (in
English).