Int. J. Management and Decision Making, Vol. 11, No. 1, 2010

An AHP-QFD approach to developing DSS for crisis management
Magdy M. Kabeil
Department of Computer and Information Systems,
Sadat Academy for Management Sciences,
P.O. Box 5742 Heliopolis West, Cairo, Egypt
E-mail: mkabeil@gmail.com
Abstract: The paper defines and demonstrates the use of a framework for
developing a DSS for crisis management. The framework is based on the
analytical hierarchy process (AHP) and the quality function deployment (QFD)
technique. The AHP is used for defining and assigning relative weights for a
comprehensive list of requirements that covers all expected crises of a crisis
management centre. The QFD is used for defining and assigning relative
weights for design components that support the defined requirements. An
example of applying the framework for developing a conceptual design of DSS
in a crisis management centre is demonstrated.
Keywords: strategic decision making; crisis management; decision support
systems; DSS; analytical hierarchy process; AHP; quality function deployment;
QFD.
Reference to this paper should be made as follows: Kabeil, M.M. (2010) 'An AHP-QFD approach to developing DSS for crisis management', Int. J. Management and Decision Making, Vol. 11, No. 1, pp.55–68.
Biographical notes: M.M. Kabeil is a Professor of Information Systems at
Sadat Academy for Management Sciences, Egypt, a US Fulbrighter to UC
Berkeley 1995–1996, and a former Director of the National Operations
Research Centre of Egypt. He holds a PhD in Information Systems and Quality
Assurance from the College of Engineering, Ain Shams University, Egypt, and
an MS in Operations Research and Systems Analysis from the Air Force
Institute of Technology, Dayton OH, USA. He teaches undergraduate and
graduate courses in DSS, modelling and simulation, system dynamics, and
business process reengineering. In addition to two monographs and five
chapters in books, he has published over 30 scientific papers in professional
peer-reviewed journals and proceedings. His research interests include DSS,
crisis management, and IT planning, education and diffusion in developing
countries.

1 Introduction

Decision support systems (DSS) for crisis management are more complex than regular
DSS. In crisis management DSS the solution sets are dynamic and reflect the changing
nature of domains in crisis situations (Toma, 2004). The information requirements of such
a DSS differ from those of a regular DSS in several respects:


•	data comes from different sources, in different formats, and with different levels of timeliness
•	the content of a data set is usually encountered in a mass of similar material relating to a variety of both relevant and irrelevant subjects
•	critical data items need efficient methods of filtering, validating, referencing, cataloging, storing, and updating within a limited time frame
•	significant items of data, when separated from the other material surrounding them, are often found to be fragmentary and incomplete
•	detection of an important data item is usually followed by an intensive search for further complementary material
•	much of the information is non-quantitative in nature and needs special techniques to incorporate it in the decision structure
•	information is frequently highly subjective and consists of opinions and assessments rather than factual data
•	information interpretation is often inseparable from the acquisition of information
•	much of the work of information processing is concentrated on the search for clues from which assessments of present and potential future environmental conditions can be made (Radford, 1978).

The paper defines and demonstrates the use of a framework for developing a DSS for
crisis management. The framework is based on the analytical hierarchy process (AHP)
and the quality function deployment (QFD) technique. The paper is structured in five
sections. After this introduction, Section 2 discusses the quality capacity of the decision
making process, which serves as a guide for addressing relevant components of the design.
Section 3 presents the proposed framework and a proof of concept (March and Smith, 1995).
Section 4 develops an example of using the framework. Finally, Section 5 concludes the paper.

2 Quality capacity of the decision making process

The generic decision cycle in crisis management context starts with data gathering. The
data is processed further to higher levels of information, knowledge, intelligence,
wisdom, and decision (Figure 1). The decision is implemented through a
command, control, communication, computer, and intelligence (C4I) system to move a
real-world situation to the most appropriate position for the next decision or action
(Harris, 2008).
The quality of a decision is a rating of whether the decision reflects the preferences of the
decision makers and meets the stated objectives effectively (including byproducts) and
efficiently (with minimum side effects). A process is a set of successive operations
targeting a specific result. The quality capacity of a process is its capability to
produce a specific level of quality (Juran, 1992).

Figure 1   Levels of processing in DSS (see online version for colours)

The quality of decision is built from the very beginning all through the decision cycle
(Herek et al., 1987; Keren and De-Bruin, 2004). The expected quality of a decision can
be estimated by investigating the quality capacity of the decision process (Davern, et al,
2008; Herek et al, 1987).
According to Simon (1977), the decision making process consists of three main
phases: intelligence, design, and choice. In the intelligence phase, the environment is
searched for conditions requiring a decision and information gathered with respect to
those conditions. In the design phase, the available courses of action are determined and
analysed to determine their relative values as solutions to the decision problems. In the
choice phase, an available course of action is selected that is designed to convert the
present, less-desirable situation into a future situation believed to be more desirable.
Radford (1978) believes that an effective formulation of procedures for dealing with
strategic decision problems requires a modification of Simon's model that takes into
account the interaction between participants, which is a major feature of complex
decision problems such as in crisis management.
In this paper, the quality capacity of the decision making process in the context of
crisis management is considered in four phases: intelligence, design, choice, and integration.
The quality capacity of the intelligence phase is assessed using the data quality
management framework proposed by Shankaranarayanan and Cai (2006). The approach
is based on the notion of managing information as a product (Wang et al., 1998). The
quality capacity of the design phase is assessed through the evaluation of the analytical
tools and models used in a set of expected crises.
The quality capacity of the choice phase is assessed from several perspectives. Most
psychologists and political science scholars (Janis and Mann, 1977) believe that the main
reason for defective decision making is groupthink. Groupthink is a type of thought
exhibited by group members who try to minimise conflict and reach consensus without
critically testing, analysing, and evaluating ideas (Baron, 2005). Janis (1982) listed seven
symptoms of the defective decision-making process which groupthink can produce:


•	incomplete survey of alternatives
•	incomplete survey of objectives
•	failure to examine risks of the preferred choice
•	failure to reappraise initially rejected alternatives
•	poor information search
•	selective bias in processing available information
•	failure to develop contingency plans.

The end result of this causal chain is a greater probability that a poor quality decision will
be made. On the other hand, Janis (1982) devised seven ways of preventing groupthink:
1	leaders should assign each member the role of critical evaluator
2	higher-ups should not express an opinion when assigning a task to a group
3	several independent groups should be formed for working on the same problem
4	all effective alternatives should be examined
5	each member should discuss the group's ideas with trusted people outside of the group
6	the group should invite outside experts into meetings for discussion and questioning
7	at least one group member should be assigned the role of Devil's advocate.

Deploying tools and techniques for handling groupthink will significantly improve
the quality capacity of the choice phase.
The quality capacity of the integration phase is assessed through testing the
communication and collaboration protocols. An example of building quality capacity in the
integration phase is the National Response Plan (NRP) in the USA (DHS, 2006). The
plan is intended to integrate public and private response by providing a common
language and outlining a chain-of-command when multiple parties are mobilised. The
NRP is a companion to the National Incident Management System, which acts as a more
general template for incident management regardless of cause, size, or complexity (DHS,
2006). The common alerting protocol (CAP) is a relatively recent mechanism that facilitates
crisis communication across different mediums and systems. CAP helps create a
consistent emergency alert format to reach geographically and linguistically diverse
audiences through both audio and visual mediums.
Akao (1990) combines quality assurance concepts with value engineering concepts in
the quality function deployment (QFD) technique, which is used in the next sections for
transforming crisis requirements into design components.

3 Proposed framework for developing a DSS for crisis management

The proposed framework is based on the analytical hierarchy process (AHP) and the
quality function deployment (QFD) technique. The AHP is used for defining and
assigning relative weights for a comprehensive list of requirements that covers all


expected crises of the case study. The QFD is used for defining and assigning relative
weights for design components that support all requirements of the expected crises.

3.1 The analytical hierarchy process (AHP)


The AHP is a mathematically rigorous, proven process for prioritisation using pairwise
comparison judgments (Saaty, 1982). Based on the pairwise comparison, a composite
weight and relative priority are calculated for each item. The AHP is used in this paper
for structuring the relations and weighting the relative importance of the union of
requirement items for all expected crises.
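As a concrete illustration of this step, the sketch below derives a priority vector from a pairwise comparison matrix using the principal-eigenvector method associated with Saaty's AHP; the three-requirement matrix and the consistency check shown are illustrative assumptions, not data from the case study. The same routine can be reused for the per-crisis CRIR weights described later in this section.

```python
# Minimal AHP sketch: derive relative weights from a reciprocal pairwise
# comparison matrix via the principal eigenvector (Saaty). The matrix below
# is an illustrative example, not data from the paper.
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Return (weights, consistency_ratio) for a reciprocal comparison matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                    # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                # normalised priority vector
    # Saaty's random consistency index for n = 1..9
    ri = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45][n - 1]
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / ri if ri else 0.0                    # should stay below ~0.10
    return w, cr

# Hypothetical judgments for three requirements of one expected crisis
A = np.array([[1, 3, 5],
              [1/3, 1, 2],
              [1/5, 1/2, 1]])
weights, cr = ahp_weights(A)
print(weights, cr)
```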
Figure 2   Relative weights of requirements

First, a list of all expected crises is defined (Cj, j = 1, ..., n), and a probability of
occurrence (Pj) and a marginal loss-success value (Vj) are assigned to each one. The
crisis importance rating (CIRj = Pj × Vj) allows us to prioritise the expected crises.
The Delphi technique is used to collect and integrate these data.
The main deliverable of this step is a list of expected crises (1), along with the crisis
importance rating (2):
Cj, j = 1, ..., n                                          (1)

CIRj = Pj × Vj, j = 1, ..., n                              (2)
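A minimal sketch of equation (2) follows; the crisis names, probabilities and loss-success values are hypothetical placeholders for the Delphi-derived inputs.

```python
# Crisis importance rating, CIRj = Pj * Vj (equation (2)).
# The crises, probabilities and marginal loss-success values are hypothetical.
crises = {
    "C1 flood":        {"P": 0.30, "V": 70},
    "C2 cyber attack": {"P": 0.15, "V": 90},
    "C3 epidemic":     {"P": 0.05, "V": 100},
}

cir = {name: c["P"] * c["V"] for name, c in crises.items()}

# Rank the expected crises by importance rating
for name, rating in sorted(cir.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: CIR = {rating:.1f}")
```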

Second, the expected scenario and associated requirements (CRjk, k = 1, ..., m) are developed
for each one of the anticipated crises (j = 1, ..., n) based on historical data and experts'
assessment. The experts' assessment is acquired through semi-structured interviews and
brainstorming. The AHP is used for developing the crisis requirement importance rating
(CRIRjk, j = 1, ..., n and k = 1, ..., m) through pairwise comparison. The CRIR allows us to
prioritise the requirements of each one of the expected crises.


The main deliverables of this step are lists of requirements (CR) along with the crisis
requirement importance rating (CRIR) for each one of the expected crises.
CRjk, j = 1, ..., n, k = 1, ..., m                         (3)

CRIRjk, j = 1, ..., n, k = 1, ..., m                       (4)

Third, all sets of requirements for all expected crises are integrated into one single set of
integrated requirements (IR). The sum of all (CIR × CRIR) products for the same
requirement becomes the integrated requirement importance rating (IRIR) according to
the following equation:

IRIRi = Σ (CIRj × CRIRjk), summed over all (j, k) mapped to integrated requirement i       (5)

These final IRIR are normalised as depicted in Figure 2 before being combined into the
QFD matrix.
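The aggregation and normalisation of equation (5) can be sketched as follows; the mapping from crisis requirements to integrated requirements and all numbers are illustrative assumptions.

```python
# Integrated requirement importance rating, equation (5):
# IRIR_i = sum of CIR_j * CRIR_jk over all crisis requirements (j, k) mapped
# to integrated requirement i, normalised afterwards (as in Figure 2).
# The mapping and the numbers are illustrative placeholders.
from collections import defaultdict

cir = {"C1": 21.0, "C2": 13.5}                      # CIR values, e.g., from the previous step

# (crisis, requirement) -> (CRIR within that crisis, integrated requirement id)
crisis_requirements = {
    ("C1", "evacuation routing"):  (0.45, "IR1 geospatial data"),
    ("C1", "shelter capacity"):    (0.55, "IR2 resource tracking"),
    ("C2", "incident forensics"):  (0.60, "IR3 event logging"),
    ("C2", "service restoration"): (0.40, "IR2 resource tracking"),
}

irir = defaultdict(float)
for (crisis, _req), (crir, integrated_id) in crisis_requirements.items():
    irir[integrated_id] += cir[crisis] * crir

total = sum(irir.values())
irir_normalised = {k: 100 * v / total for k, v in irir.items()}
print(irir_normalised)
```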

3.2 Quality function deployment (QFD)


The set of integrated requirements (IR) with integrated requirement importance ratings
(IRIR) is transformed into components of the DSS design using the QFD. The QFD is an
overall concept that correlates design quality with user requirements (Sankaran et al.,
2008; Slinger, 1992). It has been used in several domains including decision support and
crisis management (Vlad et al., 2006; Kara-Zaitri and Al-Daihan, 1999). The technique
uses matrices to collect, analyse and manage goals towards a final product. However,
95% of the studies applying QFD so far use only the first matrix, called the house of quality
(HoQ) (Chan and Wu, 2002). Basically, such a matrix correlates requirements ('what') to
design components ('how'). The correlations between the integrated requirements and the
design components are specified using the symbols 'strongly related', 'moderately
related', 'weakly related', or 'not related', which represent the corresponding numerical
values 9, 3, 1, or 0 in the matrix calculations. The internal relationships between design
components are identified in the upper part of the HoQ matrix as depicted in Figure 3.
The row score of each design component is the sum of the products of the relative
weights of the integrated requirement importance ratings (IRIR) and the correlation values
between requirements and components. The relative % is the percentage of the total
score that each component contributes to the design (Table 1). The results show that the
design components fulfil the integrated requirements of the DSS.
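A compact sketch of this roll-up is shown below, assuming a hypothetical set of normalised IRIR weights and a sparse correlation matrix; it mirrors the score and relative % calculation behind Table 1 without reproducing the paper's actual data.

```python
# House of quality roll-up: each design component's score is the sum of
# IRIR weight * correlation value (9 strong, 3 moderate, 1 weak, 0 none),
# and its contribution is reported as a percentage of the total score.
# Requirement weights and correlations are illustrative placeholders.
irir = {"IR1": 40.0, "IR2": 35.0, "IR3": 25.0}       # normalised IRIR weights

correlation = {                                       # requirement -> component -> 9/3/1/0
    "IR1": {"DSS-1.1": 9, "DSS-1.2": 3, "DSS-2.2": 1},
    "IR2": {"DSS-1.2": 9, "DSS-2.1": 3},
    "IR3": {"DSS-1.1": 3, "DSS-2.2": 9},
}

components = sorted({c for row in correlation.values() for c in row})
score = {c: sum(w * correlation[r].get(c, 0) for r, w in irir.items())
         for c in components}

total = sum(score.values())
for c in components:
    print(f"{c}: {100 * score[c] / total:.1f}%")     # relative % of the design
```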

Table 1   Relative weight (%) of the DSS components

Component    Rel. %
DSS-1.1       7.8
DSS-1.2      11.4
DSS-1.3       6.2
DSS-1.4       6.0
DSS-1.5       6.3
DSS-1.6       7.1
DSS-1.7       6.0
DSS-2.1       9.1
DSS-2.2       7.8
DSS-2.3       4.3
DSS-2.4       3.9
DSS-2.5       4.1
DSS-2.6       4.6
DSS-3        10.7
DSS-4         4.5

The design is tested against the seven symptoms of the defective decision-making process
of Janis (1982), which are discussed in the previous section. The main deliverable of this
step is a list of design components of the DSS as defined in the next section.

Figure 3   HoQ model (see online version for colours)


4 Example of applying the framework for developing a DSS

The resulting design of the DSS for crisis management consists of four main units, which
are intelligence unit (DSS-1), design unit (DSS-2), choice unit (DSS-3), and integration
and display unit (DSS-4). The four units in turn include 15 components (Kabeil, 2009) as
depicted in Figure 4 and as illustrated below.
Figure 4   Main components of DSS for crisis management (see online version for colours)
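For reference, the unit/component structure of Figure 4 can be written down as a simple data structure; the short labels are abbreviated from the component descriptions that follow.

```python
# The four units and 15 components of the resulting DSS design (Figure 4),
# captured as a simple dictionary; labels are abbreviated from the
# descriptions given in the text.
DSS_STRUCTURE = {
    "DSS-1 Intelligence unit": [
        "DSS-1.1 Monitoring", "DSS-1.2 Database", "DSS-1.3 Planned communication",
        "DSS-1.4 Ad-hoc communication", "DSS-1.5 Reconnaissance",
        "DSS-1.6 Assumptions and forecasting", "DSS-1.7 Validation and reasoning",
    ],
    "DSS-2 Design unit": [
        "DSS-2.1 Analysis and knowledge management", "DSS-2.2 Modelling and simulation",
        "DSS-2.3 Operations", "DSS-2.4 Social and media",
        "DSS-2.5 Political", "DSS-2.6 Government departments",
    ],
    "DSS-3 Choice unit": ["DSS-3 Choice"],
    "DSS-4 Integration and display unit": ["DSS-4 Integration and display"],
}

assert sum(len(v) for v in DSS_STRUCTURE.values()) == 15
```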


DSS-1 Intelligence unit


Information relevant to the decision situation is gathered and maintained by the
intelligence unit all through the crisis life cycle. The intelligence unit consists of seven
components.
DSS-1.1 Monitoring component: The monitoring component supports the production,
dissemination, and display of crisis theatre data. It consists of nodes that can
create, receive, edit, transmit, and store video clips as well as images, graphics,
voice, and text data.
DSS-1.2 Database component: The crisis management database is an integrated collection of
DSS core data. The database structure corresponds to the needs of the expected
crises and allows access by multiple users and use by several applications of the
centre.
DSS-1.3 Planned communication component: This component is responsible for building
and managing a network of distributed databases belonging to several
organisations and government departments. Principles of electronic data
interchange are applied to this component (Beckner, 2008; Fowler et al., 2007).
DSS-1.4 Ad-hoc communication component: This component is responsible for
developing an index of data sources of different subjects and specialties. The
coordination protocols between the DSS and each one of these sources of data
include only types of content, format, response time, security, and error control
of data exchange.
DSS-1.5 Reconnaissance component: This component is responsible for conducting
information gathering activities of strategic or operational significance. The
component complements other intelligence components by obtaining specific,
well-defined and time-sensitive information.
DSS-1.6 Assumptions and forecasting component: A typology of uncertainty is developed
and maintained in this component along with a framework for handling
uncertainty at different stages in the crisis management processes (Refsgaard et
al., 2007; Royes and Bastos, 2006). The responsibility of this component is to
fill gaps of data in the decision structure (Ballou and Pazer, 2003; Fildes et al.,
2006) and to conduct forecasting activities of decision consequences and
warning signals (Hopple et al., 1984).
DSS-1.7 Validation and reasoning component: Reasoning is the process by which new
information is derived from combinations of existing information. Reasoning
allows the crisis management team to rely on information as fact, even though it
has not specifically verified this information in a physical manner (Browne et
al., 1997).
The main quality attributes of the intelligence unit outcomes are timeliness, accuracy,
usability (tailored to the specific needs), completeness, relevancy, objectivity (unbiased,
undistorted, and free from political or other constraints), availability (including
appropriate security classification), and understandability.


DSS-2 Design unit


The main responsibility of this component is to develop alternatives for decision making.
Creativity in developing alternatives is a key factor in crisis decision making.
Alternatives may include one or a combination of technological, industrial, diplomatic,
economic, and/or military operations. Specifications of alternatives and possible courses
of action are identified and clearly formulated. The unit includes the following six
components.
DSS-2.1 Analysis and knowledge management component: The analysis and knowledge
management component coordinates with the database component for
maintaining the DSS data-warehouse (March and Hevner, 2007). Several
techniques and tools are available to support decision-analysts in synthesising
the fragments of information, gleaning the most possible from their content, and
forming a base of knowledge for decision making (DeRouen and Sprecher,
2004). Knowledge in crisis management context includes domain-specific rules,
heuristics, boundaries, constraints, previous outcomes, know-how, and entities'
behaviour (Fischer and Mastaglio, 1991; Nevo and Chan, 2007).
DSS-2.2 Modelling and simulation component: The main responsibility of this
component is to develop and maintain the model-base of the DSS, to represent
decision makers' preferences, to incorporate qualitative factors in the decision
structure, to integrate all models related to the crisis in consideration, and to
develop the complete decision structure (Barkhi et al., 2005; Kara-Zaitri, 1996).
DSS-2.3 Operations component: Amongst the security forces of a country there are
usually small special task forces with a unique ability to conduct military
actions in a crisis context. Although these task forces belong to a different
line of command (such as the DoD), the DSS team (including the top
commander of the DoD) should be involved transparently in all details of crisis
operations.
DSS-2.4 Social and media component: The social and media component is responsible
for communicating with the community and media at large. The component
coordinates with the government departments component for crisis resilience in
society (Boin and McConnell, 2007; Farnham et al., 2006; Granatt and
Pare-Chamontin, 2006).
DSS-2.5 Political component: In most national crises, one or more foreign countries may
be direct or supportive actors. The political component is responsible for
handling the political issues in crisis context.
DSS-2.6 Government departments component: The government departments component
is responsible for communicating and coordinating with other ministries and
administrations in crisis management according to their specialties.
The main quality attributes of the unit outcomes are complete, innovative, practical,
accurate, usable, secure, and synchronised alternatives. Each alternative is specified in
terms of costs and benefits.


DSS-3 Choice unit


The choice unit works as a single component that is responsible for developing a
preferences structure of the decision makers in a crisis context (Limayem et al., 2006;
Yates, 1990). Once the legitimate stakeholders, problem definition, objectives, and policy
strategies are agreed upon, a criteria hierarchy and a preferences structure are developed and
used for ranking the alternatives and measuring the degree to which each objective is met
(Frijda, 2005).
Crisis management decision problems are prime candidates for the multiple criteria
decision making (MCDM) techniques identified by Keen (1987). These problems
involve multiple stakeholders, conflicts of preferences, ethical choices, and trade-offs
among economic, social, and environmental objectives. Addressing such problems
requires communication, team support, and increased emphasis on interactive MCDM
methods.
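As a minimal illustration of such a preferences structure, the sketch below ranks hypothetical alternatives by a weighted sum over a small criteria hierarchy; the criteria, weights, and scores are assumptions, and the interactive MCDM methods cited above go considerably beyond this simple aggregation.

```python
# Minimal multi-criteria ranking sketch: alternatives are scored against
# weighted criteria and ranked by weighted sum. Criteria, weights and
# scores are hypothetical placeholders.
criteria_weights = {"economic": 0.4, "social": 0.35, "environmental": 0.25}

alternatives = {                                    # scores on a 0-10 scale
    "A1 diplomatic initiative": {"economic": 6, "social": 8, "environmental": 9},
    "A2 economic sanctions":    {"economic": 4, "social": 5, "environmental": 7},
    "A3 joint task force":      {"economic": 7, "social": 4, "environmental": 5},
}

def weighted_score(scores):
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranking:
    print(f"{name}: {weighted_score(scores):.2f}")
```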

DSS-4 Integration and display unit


The integration and display unit works as a single component that is responsible for
integrating all components of the DSS and enabling its team to transparently access the
internal and external components of the system (Huang et al., 2002). The display function
of the unit is concerned with making the information that is pertinent to the current status
available, in a suitable format, to the team responsible for decisions.
The five modes of operation that need different types of interface are subscription,
terminal, clerk, intermediary, and conference modes (Alter, 1980). In the subscription
mode a decision-maker receives scheduled specific reports that are generated and
submitted automatically by the system. In the terminal mode a decision-maker interacts
directly with the DSS in an online manner for a specific inquiry. In the clerk mode, a
decision-maker uses the system indirectly via input coding forms or other electronic
batch submission processes. In the intermediary mode, a decision-maker interacts with
the system through one or more intermediary users.
In the conference mode, the decision-makers interact comprehensively with the
system as a group decision making process, which is the main mode of operation in the
DSS. The conference room is a display room similar to military command centres. Three
sets of screens are used in the conference mode for displaying data to the crisis
management team, which are crisis theatre screens, alternatives design screens, and
choice criteria screens. Their purpose is to bring required strategic information, as well as
relevant information from the internal system, to the notice of the decision makers in as
comprehensive and expressive a way as possible within the time limits.
The technical architecture of the unit is supported by a network of integrated work
stations, file servers, communication links, and encryption devices that ensure the
survivability, interoperability, security, compatibility, and capacity of the system (Goss,
2006; Graves, 2004). The DSS expertise, DSS design, and IT infrastructure affect the
DSS use behaviour and performance (Lee et al., 2008; Moreau, 2006; Benamati and
Lederer, 2008; Adkins et al., 2003; Austin, 1989; Yuan and Detlor, 2005).


5 Conclusions

A framework for developing a DSS for crisis management is defined. The framework is
based on the analytical hierarchy process (AHP) and the quality function deployment
(QFD) technique. The AHP is used for defining and assigning relative weights for a
comprehensive list of requirements that covers all expected crises of a crisis management
centre. The QFD is used for defining and assigning relative weights for design
components that support the defined requirements.
This framework should be used as a guide for developing a realistic system
capable of meeting most expected scenarios. It can be a basic foundation for the
development of flexible, accurate and reusable DSS for crisis management.
An example of applying the framework for developing a DSS for crisis management is
demonstrated. By combining the proposed framework with an illustrative example, the
study has provided evidence of the framework's validity.

References
Adkins, M., Burgoon, M. and Nunamaker, J.F. (2003) 'Using group support systems for strategic planning with the United States Air Force', Decision Support Systems, Vol. 34, No. 3, pp.315–337.
Akao, Y. (1990) Quality Function Deployment: Integrating Customer Requirements into Product Design, English translation, Productivity Press, New York.
Alter, S.L. (1980) Decision Support Systems: Current Practice and Continuing Challenges, Addison-Wesley, Reading, MA.
Austin, C.L. (1989) National Crisis Management and Technology, National Defense University, National War College, Fort McNair, Washington, DC 20319.
Ballou, D. and Pazer, H. (2003) 'Modeling completeness versus consistency tradeoffs in information decision contexts', IEEE Transactions on Knowledge and Data Engineering, Vol. 15, No. 1, pp.240–243.
Barkhi, R., Rolland, E., Butler, J. and Fan, W. (2005) 'Decision support system induced guidance for model formulation and solution', Decision Support Systems, Vol. 40, No. 2, pp.269–281.
Baron, R.S. (2005) 'So right it's wrong: groupthink and the ubiquitous nature of polarized group decision making', in M.P. Zanna (Ed.): Advances in Experimental Social Psychology, pp.219–253, Elsevier Academic Press, San Diego.
Beckner, M. (2008) Pro EDI in BizTalk Server 2006 R2: Electronic Document Interchange Solutions, Apress, Berkeley, CA.
Benamati, J. and Lederer, A.L. (2008) 'Decision support systems infrastructure: the root problems of the management of changing IT', Decision Support Systems, Vol. 45, No. 4, pp.833–844.
Boin, A. and McConnell, A. (2007) 'Preparing for critical infrastructure breakdowns: the limits of crisis management and the need for resilience', Journal of Contingencies and Crisis Management, Vol. 15, No. 1, pp.50–59.
Browne, G.J., Curley, S.P. and Benson, P.G. (1997) 'Evoking information in probability assessment: knowledge maps and reasoning-based directed questions', Management Science, Vol. 43, No. 2, pp.1–14.
Chan, L.K. and Wu, M.L. (2002) 'Quality function deployment: a literature review', European Journal of Operational Research, Vol. 143, No. 3, pp.463–497.
Davern, M.J., Mantena, R. and Stohr, E.A. (2008) 'Diagnosing decision quality', Decision Support Systems, Vol. 45, No. 1, pp.123–139.


DeRouen, K. and Sprecher, C. (2004) 'Initial crisis reaction and poliheuristic theory', Journal of Conflict Resolution, Vol. 48, No. 1, pp.56–68.
DHS (2006) Quick Reference Guide for the National Response Plan, Department of Homeland Security, Washington, DC, accessed 31 July 2008 at: http://www.dhs.gov/xlibrary/assets/NRP_Quick_Reference_Guide_5-22-06.pdf.
Farnham, S., Pedersen, E.R. and Kirkpatrick, R. (2006) 'Observation of Katrina/Rita groove deployment: addressing social and communication challenges of ephemeral groups', Proceedings of the 3rd International ISCRAM Conference, pp.39–49.
Fildes, R., Goodwin, P. and Lawrence, M. (2006) 'The design features of forecasting support systems and their effectiveness', Decision Support Systems, Vol. 42, No. 1, pp.351–361.
Fischer, G. and Mastaglio, T. (1991) 'A conceptual framework for knowledge-based critic systems', Decision Support Systems, Vol. 7, No. 4, pp.355–378.
Fowler, K.L., Kling, N.D. and Larson, M.D. (2007) 'Organizational preparedness for coping with a major crisis or disaster', Business & Society, Vol. 46, No. 1, pp.88–103.
Frijda, N.H. (2005) 'Dynamic appraisals: a paper with promises', Behavioral and Brain Sciences, Vol. 28, No. 2, pp.205–206.
Goss, K.C. (2006) 'Emerging technologies in emergency management', IAEM Bulletin, Vol. 23, No. 7, pp.12–20.
Granatt, M. and Pare-Chamontin, A. (2006) 'Cooperative structures and critical functions to deliver resilience within network society', Int. J. of Emergency Management, Vol. 3, No. 1, pp.52–57.
Graves, R.J. (2004) 'Key technologies for emergency response', Proceedings of the 1st International ISCRAM Conference, Brussels, pp.133–138.
Harris, R. (2008) 'Introduction to decision-making', Virtual Salt, available at: http://www.virtualsalt.com/crebook5.htm, accessed 17 October 2008.
Herek, G.M., Janis, I.L. and Huth, P. (1987) 'Decision-making during international crises: is quality of process related to outcome?', Journal of Conflict Resolution, Vol. 31, No. 2, pp.203–226.
Hopple, G., Andriole, S. and Freedy, A. (1984) National Security Crisis Forecasting and Management, Westview Press, New York.
Huang, W., Wei, K., Watson, R. and Tan, B. (2002) 'Supporting virtual team-building with a GSS: an empirical investigation', Decision Support Systems, Vol. 34, No. 4, pp.359–367.
Janis, I. and Mann, L. (1977) Decision Making: A Psychological Analysis of Conflict, Choice, and Commitment, Free Press, New York.
Janis, I.L. (1982) Groupthink: Psychological Studies of Policy Decisions and Fiascos, Houghton Mifflin, Boston.
Juran, J.M. (1992) Juran on Quality by Design: The New Steps for Planning Quality into Goods and Services, Free Press, Florence, MA.
Kabeil, M. (2009) 'A proposed framework for developing a national crisis management information system', International Journal of Information Systems for Crisis Response and Management (IJISCRAM), Vol. 1, No. 4, IGI Global, USA.
Kara-Zaitri, C. (1996) 'Disaster prevention and limitation: state of the art tools and technologies', Disaster Prevention and Management, Vol. 5, No. 1, pp.30–39.
Kara-Zaitri, C. and Al-Daihan, S. (1999) 'The application of augmented QFD to the evaluation of emergency plans', The 11th Symposium on QFD, the QFD Institute, available at: http://www.qfdi.org, accessed 31 July 2008.
Keen, P. (1987) 'Decision support systems: the next decade', Decision Support Systems, Vol. 3, No. 3, pp.253–265.
Keren, G. and De-Bruin, W. (2004) 'On the assessment of decision quality: considerations regarding utility, conflict and accountability', in D. Hardman and L. Macchi (Eds.): Thinking: Psychological Perspectives on Reasoning, Judgment and Decision Making, John Wiley & Sons, Chichester, UK.


Lee, Z., Wagner, C. and Shin, H.K. (2008) 'The effect of decision support system expertise on system use behavior and performance', Information & Management, Vol. 45, No. 6, pp.349–358.
Limayem, M., Banerjee, P. and Ma, L. (2006) 'Impact of GDSS: opening the black box', Decision Support Systems, Vol. 42, No. 2, pp.945–957.
March, S.T. and Hevner, A.R. (2007) 'Integrated decision support systems: a data warehousing perspective', Decision Support Systems, Vol. 43, No. 3, pp.1031–1043.
March, S.T. and Smith, G.F. (1995) 'Design and natural science research on information technology', Decision Support Systems, Vol. 15, No. 4, pp.251–266.
Moreau, E.M. (2006) 'The impact of intelligent decision support systems on intellectual task success: an empirical investigation', Decision Support Systems, Vol. 42, No. 2, pp.593–607.
Nevo, D. and Chan, Y.E. (2007) 'A Delphi study of knowledge management systems: scope and requirements', Information & Management, Vol. 44, No. 6, pp.583–597.
QFD-Online, available at: http://www.QFDOnline.com, accessed 31 July 2008.
Radford, K.J. (1978) Information Systems for Strategic Decisions, Reston Publishing Company, Reston, VA.
Refsgaard, J.C., Van Der Sluijs, J.P., Højberg, A.L. and Vanrolleghem, P.A. (2007) 'Uncertainty in the environmental modeling process – a framework and guidance', Environmental Modeling & Software, Vol. 22, No. 11, pp.1543–1556.
Royes, G.F. and Bastos, R.C. (2006) 'Uncertainty analysis in political forecasting', Decision Support Systems, Vol. 42, No. 1, pp.25–35.
Saaty, T. (1982) Decision-Making for Leaders: The Analytical Hierarchy Process for Decisions in a Complex World, Lifetime Learning Publications, Belmont, CA.
Sankaran, R.A., Senthil, V., Devadasan, S.R. and Pramod, V.R. (2008) 'Design and development of innovative quality function deployment model', International Journal of Business Innovation and Research, Vol. 2, No. 2, pp.203–222.
Shankaranarayanan, G. and Cai, Y. (2006) 'Supporting data quality management in decision-making', Decision Support Systems, Vol. 42, No. 1, pp.302–317.
Simon, H. (1977) The New Science of Management Decision, Prentice Hall, Englewood Cliffs, NJ.
Slinger, M. (1992) To Practice QFD with Success Requires a New Approach to Product Design, Kontinuert Forbedring, Copenhagen.
Toma, L. (2004) 'Decision making in a fast speed world: an early warning system for avoiding crises', Int. J. of Foresight and Innovation Policy, Vol. 1, Nos. 3/4, pp.218–231.
Vlad, R.C., Benyoucef, L. and Vlad, S. (2006) 'Identification of software specifications through quality function deployment', 2006 IEEE International Conference on Automation, Quality and Testing, Robotics, 25–28 May 2006, Vol. 2, pp.74–79.
Wang, R.Y., Lee, Y.W., Pipino, L.L. and Strong, D.M. (1998) 'Manage your information as a product', Sloan Management Review, Vol. 39, No. 4, pp.95–105.
Yates, J.F. (1990) Judgment and Decision Making, Prentice Hall, Englewood Cliffs, NJ.
Yuan, Y. and Detlor, B. (2005) 'Intelligent mobile crisis response systems', Communications of the ACM, Vol. 48, No. 2, pp.95–98.
