Performance Indicator Resource Catalogue
Version 1.2
2006
Contact Information
Team Leader
ICT Investment Frameworks Branch
The Australian Government Information Management Organisation (AGIMO)
Department of Finance and Administration
Minter Ellison Building, 25 National Circuit, Forrest, ACT 2603
Telephone: +61 2 6215 2222
Email: ictinvestmentframework@finance.gov.au
Acknowledgements
AGIMO acknowledges the co-operation, contributions and support of:
-
The document is not intended to be an exhaustive list of the performance indicators required
to manage ICT operations and projects. Government agencies and organisations should use
the resource library as a guide to selecting the indicators that are most relevant to their
requirements. Alternatively, managers may use the library as a source of ideas for developing
specific or unique indicators that better match the needs of their particular projects or
organisations. Project managers may also use the library to explore additional resources
within and external to government to assist them in managing the overall ICT lifecycle.
1. Introduction
1.1. Background: The Importance of ICT and its Management
The effective operation of ICT is mission critical for the Australian Government. There are few if
any programs or services that can operate without the constant underpinning of ICT.
ICT represents one of the highest areas of administrative spending for the Government, and one
that constantly requires new funding for the maintenance and replacement of existing systems
and the development and deployment of new systems.
ICT also provides the capability that Government can most cost-effectively use to improve service levels to customers: directly, through increasing the deployment of e-Government applications, or indirectly, by providing an increasingly efficient and responsive workplace for Australian Government employees, service providers and suppliers to the Government.
While the management of ICT in the Government is generally of a high standard, it is recognised
that there is scope for continuous improvement.
As in most comparable public and private sector environments there are increasing moves
towards making the management of ICT highly:
- precise;
- predictable; and
- professional.
This applies both to:
- the overarching management and operation of the enterprise's existing ICT infrastructure, systems and services; and
- the planning, development and deployment of new ICT initiatives.
The primary objective of performance management differs between organisations, but for most, and for government in particular, it is to obtain value for money. Consequently, this Performance Indicator Resource Library has been produced by AGIMO to assist Australian Government agencies and organisations with effectively using Performance Indicators.
Value comes from the comparison of inputs to outputs or outcomes, such as the cost of accomplishment and the related value of net benefits realised. The idea is to ensure that the value of benefits realised exceeds the cost of investment, i.e. to maximise the return on investment (RoI).
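To make the RoI objective concrete, the calculation can be sketched as follows. This is a minimal illustration only; the dollar figures are hypothetical and not drawn from this Catalogue.

```python
def return_on_investment(benefits: float, cost: float) -> float:
    """RoI: net benefits realised relative to the cost of investment."""
    return (benefits - cost) / cost

# Hypothetical example: $1.2m of realised benefits against a $0.9m investment.
roi = return_on_investment(benefits=1_200_000, cost=900_000)
print(f"RoI: {roi:.1%}")  # 33.3% - benefits exceed cost, so value for money is positive
```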
These represent key framing questions for the consideration of Performance Indicators.
Critical Success Factors: A critical success factor (CSF) represents a factor that must
be present if an objective is to be attained. Achieving success and avoiding failure at an
enterprise, business unit or project level depends upon organisations identifying and
assuring compliance with CSFs.
Example: Management commitment and end-user involvement are CSFs for any ICT
development initiative.
The selection of Performance Indicators for enterprise ICT and/or ICT projects should be
grounded in an initial identification of the applicable CSFs.
The indicators required for the monitoring and assessment of ICT initiatives and those required to monitor and assess the operational, business-as-usual aspects of ICT are similar in nature but can often have different data sources, timing imperatives and risk profiles.
In both cases it is essential to understand that performance management, and therefore the indicators that enable its effectiveness to be monitored, needs to apply not just to ICT components (e.g. personnel, systems, infrastructure) but to the business functions that the systems support (e.g. customer service, accounting).
There is an almost unlimited range of performance indicators available, and many more can be developed to suit specific projects. Performance indicators are not KPIs until selected and applied as key to a specific project.
The skill in applying KPIs is in selecting the optimum number of appropriate KPIs. This maximises the benefit of using them whilst minimising the cost of using them. This optimum will vary from one project and project management team to another.
[Figure: a hierarchy linking project objectives (impact indicators), project outcomes (outcome indicators) and project outputs (output indicators) down through activities, deliverables and resources, all resting on assumptions and risk. Copyright Hillwatch 2005.]
The above hierarchical analysis may be done for organisations as a whole or for discrete ICT
projects.
Based upon the analysis it is possible to construct a useful cluster of PIs that can provide insights
into performance at the levels of:
- Planning
- Project Management
- ICT Operations
- Enterprise ICT Management.
2. Performance Indicators
2.1. Introduction
Performance Indicators (PI):
- Are metrics or factors that tend to indicate the health, progress and/or success of a project, process or area of service delivery.
- Are process-oriented, but IT driven.
- Focus on resources and processes that are most likely to lead to successful outcomes.
- Are usually short, focused, relevant, measurable, repeatable and consistent.
- Measure critical success factors.
The management of particular situations will often require a combination of more than one of the
above.
Performance indicators should be actionable in the sense that when an indicator reflects a
situation or change that exceeds a pre-agreed tolerance, managerial intervention or corrective
action should be possible.
The key principles associated with the selection, implementation and use of PIs are provided in
section 2.3 below.
2.2. PI Types
All PIs are based upon measurement.
Performance Indicators can be quantitative or qualitative.
They can be precise and measurable to a high degree of mathematical accuracy, or may need
to be based upon expert or collective opinion.
They can be:
- Binary or Absolute. These are in effect yes or no measures. They are indicators of whether a desired state is present or not.
  Example: Has an ICT strategy been prepared?
  These indicators often need to be qualified by other, less absolute measures.
  Example: In the case of an ICT Strategy these could represent the answers to the question: How complete and current is the ICT Strategy?
- Comparative. These take the situation as it is and measure it against a relevant and anticipated state. Examples include:
  - comparison of costs, savings, efficiency gains, etc, actual against budget or plan;
  - comparison of systems development progress with the pre-approved schedule;
  - comparison against industry or sector benchmarks;
  - comparison against known results (for the organisation) for a similar period, event or project.
  Example: Cost of acquisition versus planned cost of acquisition.
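A comparative indicator of this kind reduces to a simple variance calculation. The sketch below is illustrative only, with hypothetical figures:

```python
def variance_against_plan(actual: float, planned: float) -> float:
    """Percentage variance of an actual result against its planned or budgeted value."""
    return (actual - planned) / planned * 100

# Hypothetical example: acquisition cost of $550,000 against a planned $500,000.
print(f"{variance_against_plan(550_000, 500_000):+.1f}% against plan")  # +10.0% over plan
```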
Output Indicators:
- Cost of a specific deliverable or functionality (e.g. cost of a PC, cost of annual support per PC, cost per annum of a terabyte of storage) relative to plan, budget or benchmark.
- Functional capacity (e.g. the number of specific documents that can be processed per unit of time) relative to plan, budget or benchmark.
- Usage factors, e.g. usage of systems resources at peak periods expressed as a percentage of available capacity.
- System downtime, e.g. expressed as a percentage for all time and/or peak business hours.
Outcome Indicators:
-
PI Category: Purpose
- Investment: The investment category examines the returns relative to the outlays.
- Financial
- Human Resources: While there are many human stakeholders, these PIs relate primarily to ICT staff and contractors. The indicators are required to provide insights into factors such as: productivity, skill and qualification levels, retention, and attendance.
- Service
- Procurement and Contractual
- Development: These are PIs covering the full Systems Development Lifecycle. While these are of most applicability to ICT Projects, they also have applicability to software maintenance and enhancement initiatives.
- Training & Support
- Operations
- Systems
- Risk Management
- Management and Governance: Whereas the indicators for each of the above categories are in effect indirect commentators on the state of ICT Operations or Project Management and Governance, there are additional indicators that reveal the effectiveness, suitability and professionalism of agency approaches to these key disciplines.
2.5. PI Principles
The principles that apply to the selection and implementation of Performance Indicators include:
- Where top-level PIs are used, an effective drill-down capacity must be present to enable the determination of the problematic or successful lower-order PIs (pictured in the sketch after this list).
- The collection, analysis and reporting must have integrity (this implies accuracy and completeness). A strategy to detect and remedy bias must be devised and implemented (see section 2.10).
- The measurement and reporting cost of PIs should be determined. As a general, but not absolute, principle, the cost of producing PIs should be orders of magnitude lower than the value of the achievements or the cost of the problems they are seeking to reveal.
- PIs are only indicators. Determination of the causes of performance aberrations will often require an analysis of the underlying situations and/or data.

PIs should be matched to stakeholders on the basis of:
- Their responsibility for performance in the category being measured by the PI. This relates to both indirect as well as direct responsibility. However, the more indirect the level of responsibility, the less granular the PI should be, and the more it should only be reported on an exception basis.
- Their responsibility for the business function that will be impacted by ICT performance. Such PIs are usually reflected in formal or informal Service Level Agreements between business divisions and the ICT division.
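The drill-down principle can be pictured as a simple tree of indicators. The sketch below is illustrative only; the indicator names are hypothetical.

```python
# A sketch of drill-down from a top-level PI to its lower-order PIs,
# using a simple tree of hypothetical indicators.
PI_TREE = {
    "Overall service health": {
        "System availability": {},
        "Help desk performance": {
            "Average time to resolve incident": {},
            "% of incidents resolved at first contact": {},
        },
    },
}

def drill_down(tree: dict, path: list[str]) -> dict:
    """Return the subsidiary PIs beneath the indicator named by the path."""
    node = tree
    for name in path:
        node = node[name]
    return node

print(drill_down(PI_TREE, ["Overall service health", "Help desk performance"]))
# {'Average time to resolve incident': {}, '% of incidents resolved at first contact': {}}
```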
The table below makes some suggestions as to categories of stakeholders and the nature of
appropriate PIs.
Stakeholder Group: Nature of appropriate PIs
- …: Cost-effectiveness of ICT.
- Project Managers; Manager, Systems Development: Development progress. Personnel performance. Financial performance.
2.7. Escalation
An escalation approach should be agreed as a foundation element of determining the nature of PIs and the levels and functions in the organisation that should review them.
The requirement for escalation arises where a PI primarily intended for a manager at a particular
level reaches a threshold where pre-agreed rules specify that an alert must be passed up the
line.
How high such alerts need to travel is a detailed matter, but one that must be agreed between all
levels of management from the top down.
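As a minimal sketch of such pre-agreed rules, escalation can be modelled as a walk up the reporting line. The thresholds, roles and reporting line below are hypothetical illustrations, not values prescribed by this guide.

```python
# Each rule: (maximum tolerated deviation in %, role alerted at that level).
ESCALATION_RULES = [
    (5.0, "Project Manager"),    # small deviations stay with the project manager
    (15.0, "Program Manager"),   # larger deviations are passed one level up
    (float("inf"), "CIO"),       # serious deviations travel to the top of the line
]

def escalate(deviation_pct: float) -> list[str]:
    """Alert everyone up to and including the first level whose tolerance holds."""
    recipients = []
    for tolerance, role in ESCALATION_RULES:
        recipients.append(role)
        if deviation_pct <= tolerance:
            break
    return recipients

print(escalate(12.0))  # ['Project Manager', 'Program Manager']
```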
2.8.1.1. Quantitative
Approaches to data collection include:
- Synthetics. This represents the building of synthetic data from collected data.
- Research standards and norms for benchmarking against other similar organisations and against the private sector with similar systems, e.g. banks.
- Operations research techniques, drawn from industrial engineering and work study, which are generally quantitative.
2.8.1.2. Qualitative
Approaches to data collection include:
- Qualitative data analysis. This includes analysis of user records by webmasters, system administrators, etc.
- Surveys and/or questionnaires of take-up, approval and acceptance. Survey questionnaires can be conducted by mail, email, web or, occasionally, face to face. They need to be well designed and piloted.
- Telephone survey/interview. Telephone interviews may be more valid than either written or face-to-face interviews.
- Face to face. This is really a questionnaire for which the information is collected by an interviewer rather than being completed (penned) by the contributor.
- Automated e-survey by email to users.
- Community consultation.
- Public, community, agency or stakeholder submissions.
- Focus groups with 5-10 participants.
- Nominal groups. This is a brainstorming-type exercise, using the nominal group technique for brainstorming and possibility thinking, to identify problems, propose solutions and prioritise actions.
- Observation of people. This should be unobtrusive (subjects should be unaware, so as to avoid influencing the result).
- Diaries and activity logs.
- Audit using different strategies, including interviews, desktop analysis and random or targeted sampling across the following dimensions: vertical, horizontal, project, department, procedure and process.
- Observation studies using checklists or other systematic forms.
- Behavioural analysis. An analysis and interpretation of human behaviour, particularly in terms of user behaviour changes, patterns, cycles and growth trends, but also of service providers, such as help desk and call centre staff.
- Custom analysis using: Demand and Value Assessment Methodology (DAM-VAM); Accenture Public Sector Value Model (PSVM); Cap Gemini / EU Performance Framework.
- What is to be reported. While the PI itself will obviously be reported, it will be necessary to determine what else needs to accompany it to make the information usable and actionable by the person/people to whom it is reported. Additional information could include, e.g., trends relative to the PI over previous periods, the status of subsidiary PIs to provide more granularity of analysis, and explanations for the good or bad deviations provided by the responsible person.
- To whom. This may be one or more persons and may be dependent upon the status of the indicator. Status is intended to convey the degree of goodness or badness and seriousness or lack of seriousness of the indicator. Rules need to be created, e.g. if the PI is within acceptable limits it should only be reported to the project manager; if the PI is outside of acceptable limits it should also be reported to the CIO.
- How often or when. This needs to accommodate a regular frequency of reporting for some PIs, but an event-driven approach for other PIs. Examples of the latter would include project milestones or PIs reaching a trigger point where urgent action is required.
2.9.2. Visualisation
Visualisation relates to how the PI is represented. A text representation is obvious, but it is often not the clearest or most impactful way of showing a PI or a collection of PIs.
- Traffic Light. The use of a green, amber, red approach to PIs enables the reader to determine at a glance whether a situation is as expected (green), moving towards being below expectations (amber) or seriously below expectations (red). A sketch follows this list.
- Graphs. These are particularly useful where the PI is most informative when viewed as part of a trend or when compared with other linked or comparable factors.
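A traffic-light status can be derived mechanically from a PI's deviation from its expected value. The bands below are hypothetical; in practice they would be pre-agreed per indicator.

```python
def traffic_light(value: float, target: float,
                  amber_pct: float = 5.0, red_pct: float = 15.0) -> str:
    """Green: as expected; amber: moving below expectations; red: seriously below."""
    shortfall_pct = max(0.0, (target - value) / target * 100)
    if shortfall_pct >= red_pct:
        return "red"
    if shortfall_pct >= amber_pct:
        return "amber"
    return "green"

print(traffic_light(value=92.0, target=100.0))  # 'amber' - 8% below target
```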
2.10. Resourcing
In most situations, preparing, presenting and effectively using PIs is not a trivial exercise.
Many PIs require the gathering and analysis of data from a number of sources and its timely
presentation in the most appropriate format.
There are implications for the responsible person or people as well. For PIs to function properly
the readers or consumers must be in a position to receive and react to the PIs within
appropriate time windows.
The strategy developed around PIs must detail how both of the above will be handled. It
must identify the persons responsible for collection, those responsible for analysis and
presentation and those responsible for reading and reacting to the information.
Logical points for the collection of data include:
- Compliance functions.
- Help desks.
The analysis and reporting of PIs is best handled by persons who understand the context
of the Project or ICT Operations as a whole. They should be the first point to investigate
anomalous results. In this way they take on an integrity checking function. It is important to have
this function independent from and not responsible for the areas being measured by the PIs. In
effect they provide a quasi-audit function.
In the case of managing metrics or performance indicators effectively within one's span of control, professionals may monitor hundreds of metrics periodically if they require little and infrequent attention and are of low complexity. In reality, however, where complex intervention or corrective action is necessary, managers frequently have other responsibilities; in these situations the maximum might best be limited to 7.
Most PIs are intended to be indicative rather than conclusive. They are also intended to be an aid, not a ball and chain. It is essential that any approach to the structuring and use of PIs is able to contend with situations where deviation from PIs is highly desirable, and where single-mindedly managing Projects or Operations to achieve a positive PI may be at odds with the best interests of the agency. Examples of such situations include:
- Situations in which there have been changes to underlying environmental variables, or where market conditions, stakeholder responses or prudent risk management have made particular actions essential (changing the timing or priorities for a project; culling a project; culling functionality; etc).
- For PIs that rely upon comparing the current achievement with, e.g., benchmarks, plans, budgets or prior experiences, it is essential to ensure that an apples-to-apples comparison is being made. This requires that the people compiling and using the PIs have significant insights into how both the actual and comparative components were derived.
- Some PIs may reflect a short-term anomaly that does not require corrective action. Most often these relate to timing issues. Those preparing PIs need to be able to recognise such instances and flag them for the attention of those using the PIs. It may also be necessary to apply smoothing to results to remove the impact of short-term, irrelevant deviations (see the sketch below).
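A simple moving average is one way to apply such smoothing. This sketch, with hypothetical data and window size, shows how a one-off dip is damped so it does not trigger unnecessary intervention.

```python
def moving_average(series: list[float], window: int = 3) -> list[float]:
    """Average each point with up to (window - 1) of its predecessors."""
    smoothed = []
    for i in range(len(series)):
        span = series[max(0, i - window + 1): i + 1]
        smoothed.append(sum(span) / len(span))
    return smoothed

raw = [98, 97, 82, 99, 98]   # a short-term dip in period 3
print(moving_average(raw))   # approx. [98.0, 97.5, 92.3, 92.7, 93.0] - the dip is damped
```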
The potential for evaluation bias should be recognised and strategies put in place to
manage this. The risk of evaluation bias will particularly arise where the outcomes could
lead to loss of face or damage to reputation and threats to financial incentives or even
tenure of employment. Approaches to avoidance of evaluation bias include:
-
Each of these provides detailed guides to implementing processes and practices that are
designed to yield optimal performance in both operations and development environments.
Agency business expectations dictate that ICT Performance Indicators allow for transparent and
timely registering of the levels attained in the areas of:
- IT goals and metrics that define what the business expects from IT (what the business would use to measure IT).
- Process goals and metrics that define what the IT process must deliver to support IT's objectives (how the IT process owner would be measured).
- Process performance metrics (to measure how well the process is performing and to indicate if the goals are likely to be met).
CobiT lists four categories (and 34 sub-categories) of ICT activities that need to be managed effectively.

3.1.1. CobiT Category - Plan and Organise
PO1 Define a strategic IT plan.
PO2 Define the information architecture.
PO3 Determine technological direction.
PO4 Define the IT processes, organisation and relationships.
PO5 Manage the IT investment.
PO6 Communicate management aims and direction.
PO7 Manage IT human resources.
PO8 Manage quality.
PO9 Assess and manage IT risks.
PO10 Manage projects.

3.1.2. CobiT Category - Acquire and Implement
AI1 Identify automated solutions.
AI2 Acquire and maintain application software.
AI3 Acquire and maintain technology infrastructure.
AI4 Enable operation and use.
AI5 Procure IT resources.
AI6 Manage changes.
AI7 Install and accredit solutions and changes.

3.1.3. CobiT Category - Deliver and Support
DS1 Define and manage service levels.
DS2 Manage third-party services.
DS3 Manage performance and capacity.
DS4 Ensure continuous service.
DS5 Ensure systems security.
DS6 Identify and allocate costs.
DS7 Educate and train users.
DS8 Manage service desk and incidents.
DS9 Manage the configuration.
DS10 Manage problems.
DS11 Manage data.
DS12 Manage the physical environment.
DS13 Manage operations.

3.1.4. CobiT Category - Monitor and Evaluate
ME1 Monitor and evaluate IT performance.
ME2 Monitor and evaluate internal control.
ME3 Ensure regulatory compliance.
ME4 Provide IT governance.

Note: this excerpt and others in this section are from CobiT 4.0, downloadable from www.isaca.org/cobit. The CobiT Maturity Model is derived from the Software Engineering Institute's Capability Maturity Model; see www.sei.cmu.edu/.
The CobiT Maturity Model provides organisations with a structured approach to determining the
level of maturity of their management of ICT based upon a rating for each of the (relevant)
categories/sub-categories listed above.
CobiT determines the level of ICT maturity as being at one of six levels:
- Level 0: Non-Existent
- Level 1: Initial
- Level 2: Repeatable
- Level 3: Defined
- Level 4: Managed
- Level 5: Optimised
CobiT recommends that organisations identify where the organisation currently rates, relative to where they want the organisation to be and relative to, e.g., industry averages.
The table below provides a cross-reference to the Better Practice Approaches contained in Appendix A and the detailed PI Group Sheets contained in the Resource Catalogue.

PI Category: Best Practice Methodologies (summarised in Appendix A): Applicable Performance Group/s from the Resource Catalogue
- Investment Planning and Evaluation: Balanced Scorecard: PICs 1,2,4,5,6
- Financial Management: Balanced Scorecard: PICs 4,5,6
- Human Resources: Investors in People standard: PICs 3,7
- Service: ITIL; Six Sigma; CobiT: PICs 7,8,9,10,11,12,13,14,15
- Procurement and Contractual: ITIL; CobiT: PICs 7,9,10,11,12
- Operations: ITIL; CobiT: PICs 7,8,9,10,11,12,13,14,15
- Systems Maintenance: ITIL; CMMI; southernSCOPE: PICs 7,8,12,14
- Training & Support: ITIL: PICs 3,7
- Risk Management: AS/NZS 4360: PICs 7,8,16
- ICT Security Management: Protective Security Manual; ISO/IEC 17799; ACSI33: PIC 16
- Management and Governance: CobiT: PICs 2,3,5,7
The level and areas of risk will vary dependent upon these dimensions and the scale and
complexity of the underlying business requirements.
The selection and use of Performance Indicators will be significantly informed by these factors.
- Was the development delivered on time and on budget, with most or all of the required features and functions?
- Was the system taken up and used by the anticipated user base?
- Does the new system provide the required business benefits?
Success Factor: Weighting
- Executive Support: 18
- User Involvement: 16
- Experienced Project Manager: 14
- Clear Business Objectives: 12
- Minimised Scope: 10
- Formal Methodology
- Reliable Estimates

Further analyses of these factors are available from Extreme CHAOS (2001).
In selecting PIs to monitor and assess ICT Projects it is important to ensure that they can act as
indicators of the degree of compliance of the project with these success factors.
4.2.2. Reasons for Failure
The UK Office of Government Commerce (OGC) identifies the causes of IT project failures as:
- People failures.
Causes of Failure and their Impacts on Projects:
- Prime responsibility rests with committees. Consensus must be achieved on all issues. No single individual is in authority; the project manager makes decisions in the absence of the sponsor.
- The key early part of the project is confused by contractual debate and positioning, often leaving both sides disappointed. This mistrust is then exacerbated by misunderstanding of supplier and project motivations, creating further disputes and resort to contract, leading to a culture of secrecy and 'sides'. Ultimately, the project focuses its energies on blame for failure.
People Failures
-
In selecting PIs to monitor and assess ICT Projects it is important to ensure that they can act as indicators of the degree of the project's exposure to these causes of failure.
[Figure: the ICT lifecycle and its typical KPIs across three phases. 1st Phase - Planning (pre-contract; intellectual activity: strategic design, budget, planning, approval; typical KPIs: success indicators). 2nd Phase - Contract (physical activity: structural development, system testing and acceptance, project implementation; typical KPIs: progress, testing, acceptance). 3rd Phase - Operational (post-contract; service delivery: monitoring, continual improvement; typical KPIs: service delivery, user satisfaction, growth).]
In the strategic design and planning phase, activities should be driven, or anchored, by a strong focus on needs-driven, realisable benefits. These planned benefits should have realistic, attributed monetary values, enabling comparison to investment costs. This focus on benefits will enable cost-benefit planning.
However, in the final assessment, after benefits have been established and demonstrated, assessment should be of the net realised benefits (NRB). The NRB is compared to the planned benefits, and commonly differs from them as a result of scope creep, unplanned benefits, planned benefits not realised, and disbenefits.
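The guide does not prescribe a formula, but the relationship described above can be sketched as a worked example. All figures are hypothetical, and NRB is taken here as realised planned benefits plus unplanned benefits less disbenefits.

```python
planned_benefits   = 1_000_000  # attributed monetary value planned in phase one
realised_benefits  =   900_000  # planned benefits actually realised
unplanned_benefits =   150_000  # benefits that emerged but were never planned
disbenefits        =    80_000  # negative impacts attributable to the change

nrb = realised_benefits + unplanned_benefits - disbenefits
print(f"NRB ${nrb:,} vs planned ${planned_benefits:,}")  # NRB $970,000 vs planned $1,000,000
```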
Different types of performance indicators may apply to different activities within different phases of the lifecycle. However, some indicators will be planned early for later measurement; e.g. benefits will be planned early in phase one, may be measured early in phase three, but cannot be validated until service delivery is established and settled.
Other KPIs may have similar common labels, but the content may be different, e.g., one might
have output KPIs for phases 1 and 2, but one will relate to planning outputs and the other to
project implementation outputs.
The critical KPIs generated from planning (phase 1) will be predictive success indicators. Many
success indicators may not be able to be fully identified until the phase is nearing closure. At the
end of phase 1, success indicators will give the owner, financier and Government a good
indication of the likelihood of success and will contribute to the decision of whether or not to
proceed with the project and commit resources to it, or whether to return to the drawing board.
Figure 3 below is intended to illustrate both the hierarchical nature of PIs and KPIs, as well as
their positioning across the systems development lifecycle.
[Figure 3: the hierarchy of PIs and KPIs positioned across the lifecycle. Phase 1 - Planning: prediction of project success via high-level investment indicators and KPIs for strategic priorities, outcomes, outputs and benefits. Phase 2 - Contract: verification via project management and project KPIs. Phase 3 - Operational: validation via benefits realisation indicators and operational KPIs, with monitoring for continual improvement. N.B. KPIs are escalated to an appropriate level.]
PI Category: Best Practice Methodologies: Applicable Performance Group/s from the Resource Catalogue
- Investment Planning and Evaluation: Balanced Scorecard: PICs 1,2,3,4,5,6,12
- Financial Management: Balanced Scorecard: PICs 1,2,4,5,6
- Human Resources: Investors in People standard: PICs 3,7
- Service: ITIL; Six Sigma; CobiT: PICs 3,7,8,9,10,11,12,13,14,15
- Procurement and Contractual: ITIL; CobiT: PICs 7,9,10,11,12
- Project Management: PRINCE2: PICs 1,3,7
- Operations: ITIL; CobiT: PICs 3,7,8,9,10,11,12,13,14,15
- Systems Analysis, Design, Effort Estimation, Management: ITIL; CMMI; southernSCOPE: PICs 1,3,4,5,7,12,14,15
- Systems Testing: ITIL; CMMI: PICs 7,8,14
- Training & Support: ITIL: PICs 3,7,8,14
- Risk Management: AS/NZS 4360: PICs 8,16
- ICT Security Management: Protective Security Manual; ISO/IEC 17799; ACSI33: PICs 8,16
- Management and Governance: CobiT; PRINCE2: PICs 1,2,3,4,5,6,7
Term: Description
- Asset: A single item, system or resource that has a current or future financial or non-financial value.
- Benefit
- Business Intelligence
- Bias
- Capital Cost
- Certification Body
- Commonwealth: Commonwealth of Australia.
- Cost
- cpu
- Criticality
- Critical Success Factor
- Digital Dashboard
- Disbenefits
- Enhancement
- Goal
- Governance
- Industrial and Organisational Psychologist
- Interoperability
- Impact
- Investment
- Kaizen
- Key
- Key Performance Indicator: A value indicator that has been selected and applied as key to assessing the value of a project, asset, activity, etc.
- Lagging Indicators
- Leading Indicator or Lead Indicators
- Lifecycle
- Loss
- Management
- Metric: A measure or measurement.
- Objective
- Operating Cost
- Operations Research
- Outcome
- Output
- Performance Indicator
- Probability
- Quality
- QA Certification Body
- Quality Assurance
- Replacement
- Resources
- Resourcing
- Risk
- Risk Management
- RoI
- Scope
- Scope Creep
- Six Sigma
- Span of Control
- Stakeholder/s
- Total Cost
- Upgrade
- Value
- Value Management
Acronym: Description: Link
- AGIMO: Australian Government Information Management Organisation: AGIMO website
- ANAO: Australian National Audit Office: ANAO website
- AM: Asset Manager
- BI: Business Intelligence: BI link
- C&N
- CGG: Procurement Guide
- CISA: Certified Information Systems Auditor: ISACA
- CISM: Certified Information Security Manager: ISACA
- CobiT: Control Objectives for Information and related Technology: CobiT; ITGI; ISACA
- CIO: Chief Information Officer
- CEO: Chief Executive Officer
- CFO: Chief Financial Officer
- CoE: Centre of Excellence
- DMAIC: Define, Measure, Analyse, Improve, Control: OGC-ITIL
- DoFA: Department of Finance and Administration: Finance
- EAP: Finance
- HRM: Human Resource Management
- IIMA: IIMA
- IPO
- ISACA: Information Systems Audit and Control Association: ISACA
- ISO: International Organization for Standardization: ISO website
- ISO 9000
- ITGI: IT Governance Institute: ITGI at ISACA
- ITIL: Information Technology Infrastructure Library: OGC-ITIL; Finance; IIMA
- JAS-ANZ: Joint Accreditation System of Australia and New Zealand: JAS-ANZ
- KISS
- KBI
- KGI: Key Goal Indicator
- KPI: Key Performance Indicator
- KVI
- LO
- NOIE: National Office for the Information Economy
- OGC: Office of Government Commerce: OGC
- PI: Performance Indicator
- PM: Project Manager
- PMgr.: Project Manager
- PR: Public Relations
- PRINCE2: Projects in Controlled Environments: PRINCE2
- PWC: PricewaterhouseCoopers
- QA: Quality Assurance
- QMS: Quality Management System: QMS at SAI Global
- SAI Global: SAI Global
- SLA, SLAs: Service Level Agreement/s: ITIL - SLA
- SocITM: Society of Information Technology Management: SocITM; SocITM KPI set
- SWOT: Strengths, Weaknesses, Opportunities and Threats analysis
- UK or UK of GB: United Kingdom of Great Britain
- US or USA: United States of America
APPENDICES
AGIMO is currently developing an expanded version called the ICT Business Case Guide and Tools
as part of its ICT Investment Framework. This will incorporate the DAM-VAM.
See: http://www.agimo.gov.au/government/damvam.
The Balanced Scorecard views organisational performance from four perspectives:
- Financial.
- Customer.
- Internal business processes.
- Learning and growth.
A2 Project Management
A2.1 PRINCE2
PRINCE (Projects in Controlled Environments) is a project management method covering the organisation, management and control of projects. PRINCE was first developed in 1989 by the Central Computer and Telecommunications Agency (CCTA), now part of the Office of Government Commerce (OGC), as a UK Government standard for IT project management.
Since its introduction, PRINCE has become widely used in both the public and private sectors and
is now the UK's de facto standard for project management. Although PRINCE was originally
developed for the needs of IT projects, the method has also been used on many non-IT projects.
The latest version of the method, PRINCE2, is designed to incorporate the requirements of existing
users and to enhance the method towards a generic, best practice approach for the management of
all types of projects.
The design and development work was undertaken by a consortium of project management
specialists, under contract to OGC, and over 150 public and private sector organisations were
involved in a Review Panel which provided valuable input and feedback to the consortium.
PRINCE2 is a process-based approach for project management providing an easily tailored and
scaleable method for the management of all types of projects. Each process is defined with its key
inputs and outputs together with the specific objectives to be achieved and activities to be carried
out.
The method describes how a project is divided into manageable stages enabling efficient control of
resources and regular progress monitoring throughout the project. The various roles and
responsibilities for managing a project are fully described and are adaptable to suit the size and
complexity of the project, and the skills of the organisation. Project planning using PRINCE2 is
product-based which means the project plans are focused on delivering results and are not simply
about planning when the various activities on the project will be done.
Source: http://www.ogc.gov.uk/prince2/about_p2/about_intro.htm
A PRINCE2 project is driven by the project's business case which describes the organisation's
justification, commitment and rationale for the deliverables or outcome. The business case is
regularly reviewed during the project to ensure the business objectives, which often change during
the lifecycle of the project, are still being met.
Benefits
PRINCE2 is a structured method providing organisations with a standard approach to the
management of projects. The method embodies proven and established best-practice in project
management. It is widely recognised and understood, and so provides a common language for all
participants in the project.
PRINCE2 provides benefits to the organisation, as well as the managers and directors of the
project, through the controllable use of resources and the ability to manage business and project
risk more effectively. PRINCE2 enables projects to have:
-
- Service Support.
- Service Delivery.
- Application Management.
- Security Management.
See: http://www.itil.co.uk/index.htm.
Although the CMM has proved useful to many organisations, the use of multiple models has been
problematic. Further, applying multiple models that are not integrated within and across an
organisation was seen as costly in terms of training, appraisals, and improvement activities. The
CMM Integration project was formed to sort out the problem of using multiple CMMs.
The resultant Capability Maturity Model Integration (CMMI) is a process improvement approach
that provides organisations with the essential elements of effective processes. It can be used to
guide process improvement across a project, a division, or an entire organisation. CMMI helps
integrate traditionally separate organisational functions, set process improvement goals and
priorities, provide guidance for quality processes, and provide a point of reference for appraising
current processes.
See: http://www.sei.cmu.edu/cmmi/cmmi.html.
- provides a sensible, integrated conceptual framework for linking the variety of human resource development activities upon which APS agencies have been embarking in recent years;
- encourages intelligible cascading of business-focused people development and management strategies, by demanding disciplined consideration of how HRM strategies can support the achievement of business imperatives;
- provides useful documentation... concrete products that, when utilised in a transparent way, help (agencies) make choices and plan action;
- does not prescribe a process; hence, it provides the flexibility necessary for each agency to develop HRM strategies most appropriate to its unique business circumstances and culture.
For a summary of The Investors in People 2000 Indicators and Supporting Evidence Guidelines
see: http://www.apsc.gov.au/iip/iipindicators.doc.
Source: http://www.apsc.gov.au/iip/iipfactsheet.doc
- Case studies: examples of how the southernSCOPE method was applied successfully on two Victorian government software projects.
- Presentation for project sponsors: a Microsoft PowerPoint presentation of 30 minutes duration providing project sponsors with information on the approach and benefits of the southernSCOPE method.
- Computer-based training package: a two-hour computer-based training course to support project managers during software development, teaching why, when and how to apply the southernSCOPE method.
- Reference manual: specifying the approach and the roles of the different parties involved (also includes a draft contract between the developer and the customer).
- Security Policy.
- Organisational Security.
- Personnel Security.
- Access Control.
- Compliance.
ISO/IEC 17799, together with ACSI33 (see below), provides the basis for agencies' development of ICT security Performance Indicators covering the physical and logical (systems-related) security of premises, hardware, software, networks and personnel.
See: http://www.standards.com.au/
Subject Area: Guides / Standards
- Triple Bottom-line: Global Reporting Initiative, http://www.globalreporting.org/
- Enterprise Architecture: Institute for Enterprise Architecture Developments, http://www.enterprise-architecture.info/
- Human Resources
- Knowledge Management
- Process Management
- Project Management: PRINCE2, http://www.ogc.gov.uk/prince2/; ISO 9000; southernSCOPE, see http://www.egov.vic.gov.au/index.php?env=innews/detail.tpl:m1816-1-1-7:l0-0-1-:n8320-0
- Risk Management
Organisation: Reference Descriptions: Resource Link/Ref.
- Australian Government, AGIMO, Department of Finance and Administration
- Australian Government, AGIMO, Department of Finance and Administration
- Australian Government
- Australian Government
- Cranfield University, UK: Doc. No. ISRC-BM200001. Not on web: m.bridge@cranfield.ac.uk
- Government of Victoria
- NOIE, 2002: Online Authentication
- PA Consulting Group
- PricewaterhouseCoopers, Sept. 1999
- Sida: Sida document: Sida website
- SocITM, UK: Intro to Chronicle
- UK Government: OGC-ITIL
- UK Government, Office of Government Commerce (OGC): OGC
- UK Government: PRINCE2 (OGC, UK): PRINCE2 version
- Standish Group, USA, 2003
- CobiT: CobiT
- Eric T Peterson: Web Analytics Demystified: ISBN 0-9743584-2-8
Indicator Type: Explanation
- Y/N: Yes or No
- Value: Absolute value
- %: Percentage
- $: Dollar Value
- #: Number
- H/M/L: High / Medium / Low
This group of Performance Indicators should be used to test the efficacy of ICT project
establishment and ongoing management.
Sources
Section
Ref:
Performance Group
Description
Suc
Ref No
1
Sub-Group
Description
Pre-Project Planning
Sub
Ref:
Performance
Indicator
Indicator Type
Indicator
Values
1.0
Y/N or H/M/L
2.0
Y/N or H/M/L
3.0
Y/N or H/M/L
4.0
Y/N or H/M/L
5.0
Y/N or H/M/L
6.0
Y/N or H/M/L
7.0
Y/N or H/M/L
8.0
Y/N or H/M/L
Section
Ref:
Performance Group
Description
Suc
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Indicator Type
Indicator
Values
9.0
Y/N or H/M/L
10
10.0
Y/N or H/M/L
11
11.0
Y/N or H/M/L
12
12.0
Ownership. Has the project ownership been clearly defined and has
this been agreed by all relevant stakeholders?
Y/N or H/M/L
13
13.0
Y/N or H/M/L
This group of Performance Indicators relates to the after-the-fact outcomes from an ICT project.
Sources
Section
Ref:
Performance Group
Goal Indicators
Description
Suc
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Indicator Type
Indicator
Values
10
2.0
% or H/M/L
3.0.
% or H/M/L
4.0
% or H/M/L
5.0
% or H/M/L
6.0
% or H/M/L
7.0
% or H/M/L
8.0
9.0
Y/N
Section
Ref:
Performance Group
Goal Indicators
Description
Suc
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Indicator Type
Indicator
Values
10
10.0
# and %
11
11.0
% or H/M/L
12
12.0
Number of users and cost per user. What is the number of users
and their unit cost?
# and $
Y/N or H/M/L
13
13.0
This group of Performance Indicators should be used to evaluate the performance of personnel, as well as the commitment of management and the satisfaction of users.
Sources
Investors in People Standard; SocITM; Capability Maturity Model; ISO 9000; Six Sigma
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
Stakeholders
1
Agency Management
1.0.0
1.0.1
1.0.2
1.0.3
1.0.4
1.0.5
1.0.6
Principles from:
ISO 9000
Standard,
Business
Planning,
Victoria State
Government
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Stakeholders
Sub
Ref:
Performance
Indicator
Primary Source
of PI
1.1.0
1.1.1
1.1.2
PR,
Legal Counsel or
LO,
PM Team,
C&N,
Risk
Management
Team,
HRM, Security,
Indl & Orgl
Psychologist,
Facilities
Manager
1.1.3
1.1.4
1.1.5
1.1.6
1.1.7
Link if
Known
Personnel
2
2.0
2.1
2.2
2.3
2.4
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
HR
3
HR
3.0
3.1
3.2
3.3
3.4
3.5
3.5.0
3.5.1
3.5.2
3.5.3
3.6
3.7
3.8
3.9
3.10
3.11
3.12
3.13
3.14
Management &
HRM.
ITGI
Based on
Victoria State
Government
KPIs
PA Consulting
Group
AGIMO
www.ITGI.org
ISACA.org
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
4.0
4.1
4.2
4.3
Competent
Planner
ICT competence of
employees
5.0
SocITM KPI 10
SocITM
Investment KPI
Service Enhancement:
% of customer satisfaction with service. (Design survey questionnaire.)
- Use a sufficient random survey sample, not best responses.
- State sample size.
- Use a scale of 1-7, where 1 is poor and 7 is excellent.
- Survey % of all users surveyed.
- Survey % of users with recent problem/s.
- Use a direct question such as: How do you rate the overall ICT service you receive?
SocITM KPI 1
5.1
5.2
5.3
6.0
6.0.0
6.0.1
6.0.2
6.0.3
6.0.4
6.0.5
6.0.6
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
6.0.7
6.0.8
6.0.9
6.0.10
6.0.11
6.0.12
6.0.13
7.0
Performance
Indicator
State whether internal or public users
% of users responded
State average score per respondent
Indicate if independent/external surveyor engaged
Date of survey
Frequency of survey
Change in survey result
Number of offline transactions.
Primary Source
of PI
Link if
Known
SocITM KPI 9
SocITM Access
KPI
7.0.1
7.0.2
7.0.3
7.0.4
7.0.5
8.0.0
8.0.1
8.1.2
8.1.0
8.1.1
8.1.2
8.1.3
Services Generally
Reasons for satisfaction
Reasons for dissatisfaction
Service Enhancement:
See Stake 6 above. User Satisfaction
Number of offline transactions.
Ratio of online transaction growth to other channels' growth.
SocITM KPI 1
Community (Public)
8
Based on
Victoria State
Government
KPIs
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
8.2.0
8.2.1
8.2.2
8.2.3
8.3.0
8.3.1
8.3.2
8.3.3
8.3.4
8.3.5
New Services:
Number of new services.
% of services available online and offline.
% of services online.
Service Enhancement Responsiveness:
Number of people benefiting from increased responsiveness.
Average transaction process time.
Average cycle time.
Average turnaround time.
% of users satisfied with service responsiveness.
9.0.0
9.0.1
9.0.1.2
9.0.1.3
9.0.1.4
9.0.1.5
9.0.2
9.0.3
9.0.4
9.0.5
9.0.6
9.0.7
9.0.8
Primary Source
of PI
SocITM KPI 91
Link if
Known
SocITM Comm
Access KPI
Victoria State
Government
Section
Ref:
Performance Group
Description
Stake
Ref No
Sub-Group
Description
Sub
Ref:
9.0.9
9.0.10
9.0.11
Community participation
9.1
9.2
9.3
9.4
9.5
9.6
9.7
Performance
Indicator
% of vulnerable population with internet access at home.
Services offered to people with special needs.
Services offered to the disadvantaged.
Primary Source
of PI
Victoria State
Government,
Link if
Known
Section
Ref:
Performance Group
Description
Stake
Ref No
10
Sub-Group
Description
e-Govt. e-Commerce. Also see: e-Government
Sub
Ref:
10.0.0
10.0.1
10.0.2
10.0.3
10.0.4
10.0.5
10.0.6
10.0.7
10.1.0
10.1.1
10.1.2
10.1.3
10.2.0
10.2.1
10.2.2
Performance
Indicator
e-Government Services (See Stake 6 above: User Satisfaction)
% of customer satisfaction with service
Number of offline transactions
Number of hits on home page
Number of hits on service pages
Number of unique visitors
% of returning users
% of transactions occurring online
Ratio of online transaction growth to other channels' growth.
New Services:
Number of new services
% of services available online and offline
% of services online
Service Convenience:
Services offered to people with special needs.
Services offered to the disadvantaged.
Primary Source
of PI
SocITM
Link if
Known
SocITMeSer Del
KPI
Victoria State
Government,
This group of Performance Indicators should be used to measure the investment effectiveness of Enterprise ICT and
ICT Management
Sources
AGIMO Demand and Value Assessment; Triple Bottom Line; Balanced Scorecard: Capability Maturity Model
Section
Ref:
Performance Group
ICT Investment
Description
Invest
Ref No
1
Sub-Group
Description
In ICT by employee
Sub
Ref:
Performance
Indicator
1.0
All budgeted ICT and ICT related expenditure for the current
financial year, including staffing, equipment and software for
the whole agency, not just the ICT unit.
1.1
1.2
PA Consulting
Group
1.3
PA Consulting
Group
1.4
Primary Source
of PI
SocITM KPI 14
In infrastructure
2.0
2.1
PA Consulting
Group
3.0
SocITM KPI 90
Link if
Known
SocITM
KPI
Investment
SocITMInvest/Cap
KPI
Section
Ref:
Performance Group
ICT Investment
Description
Invest
Ref No
4
Sub-Group
Description
IP protection
Sub
Ref:
Performance
Indicator
Primary Source
of PI
4.0
OGC PRINCE2
OGC PRINCE2.
AGIMO.
Auckland
University, NZ.
6.2
7.0
7.1
AGIMO
ICT Business
Case
Development
Guide
4.1
5
5.0
5.1
5.2
5.3
6.0
6.1
Business Case
Link if
Known
Project Scoping Template, Auckland University
Section
Ref:
Performance Group
ICT Investment
Description
Invest
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Project Approval
8.0
8.1
8.2
8.3
OGC PRINCE2
or
AGIMO
Financial / Economic
9.0
9.1
9.2
9.3
9.4
9.5
9.6
9.7
9.8
9.9
9.10
9.11
9.12
9.13
9.14
9.15
Link if
Known
This group of Performance Indicators examines the alignment of ICT investment with Government business
objectives
Sources
Section
Ref:
Performance Group
Description
Biz
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
1.0
1.1
Agency Strategic
Business
Planning
Committee
Portfolio management
2.0
OGC
3.0
Agency Strategic
Business
Planning
Committee
Link if
Known
OGC
Portfolio
Mgmt guide
Section
Ref:
Performance Group
Description
Biz
4
Shared services
Governance or Compliance
with governance
4.0
4.1
4.2
4.3
4.3.0
4.3.1
4.3.2
4.3.3
Agency
Secretary, LO
and CIO
Stakeholders
5.0
5.1
Agency
Secretary, LO
and CIO.
Shared Services, PWC, Sept. 1999
6.0
6.1
6.2
6.3
6.4
6.5
6.6
Project mandate
Project brief
Benefits case
Governance plan
Budget approval
Approval Qualifications, e.g., comply with special condition
Project plans
Agency, and
Stakeholders,
Secretary, CIO,
Risk Manager,
Budget Group,
and AGIMO,
Finance
7.0
PA Consulting
Group
7.1
This group of Performance Indicators examines the extent to which benefits, whether foreshadowed in budgets and plans or not, have been achieved.
Sources
AGIMO Demand and Value Assessment; Triple Bottom Line; Balanced Scorecard: Capability Maturity Model
Section
Ref:
Performance Group
Benefits Realisation
Description
Cranfield
University
Ben
Ref No
1
Sub-Group
Description
Organisational
Sub
Ref:
Performance
Indicator
1.0
Agency
Secretary, CIO
and Benefits
Manager
2.0
2.1
Agency
Secretary, CIO,
and Benefits
Manager,
Budget Group
and AGIMO,
Finance
3.0
Agency Benefits
Manager
3.1
1.1
Government
Community
Primary Source
of PI
Link if
Known
Section
Ref:
Performance Group
Benefits Realisation
Description
Cranfield
University
Ben
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Financial
4.0
Agency CIO
Budget Group,
Finance.
Acceptance Criteria
5.0
Agency
Secretary, CIO,
Quality Manager
and Benefits
Manager
Link if
Known
This group of Performance Indicators provides a basis to measure and monitor the performance of ICT
Management
Sources
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
1
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Approval Requests
1.0.0
1.0.1
1.0.2
1.0.3
1.2.0
1.2.1
1.2.2
1.2.3
AGIMO Business
Case Guide v2.0
and Business
Case
Development
Tool
1.2.4
1.2.5
1.2.6
1.2.7
1.2.8
1.2.9
1.2.10
Link if
Known
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
Sub-Group
Description
Sub
Ref:
1.2.11
1.2.12
1.2.13
1.2.14
1.2.15
1.2.16
Business Plan
2.0
2.0.1
Performance
Indicator
Primary Source
of PI
Agency Corporate
Services
Agency business cycles have been identified and taken into account
2.1
Consult related
agencies.
AGIMO, Finance.
2.2
ANAO Business
Continuity Guide
2.3
2.4
Link if
Known
ANAO BC Guide
ANAO BC Mgmt
Centrelink,
Defence and
Finance
Handbooks
PRINCE2
Not on Internet
OGC Causes of Failure
PRINCE2a
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
Sub-Group
Description
Sub
Ref:
2.5
Financial Plan
Governance Plan
Performance
Indicator
Primary Source
of PI
Link if
Known
Agency Strategic
Business
Planning Group
Finance (DoFA)
Contact Budget
Group,
Finance.
4.0
4.1
4.2
4.3
AGIMO,
OGC PRINCE2
ITIL,
ITGI & ISACA
SAI Global
ISACA.org
www.ITGI.org
ISACA.org
SAI Global
ISO 9000
4.4
4.5
4.6
4.7
4.8
3.0
3.1
3.1.1
3.1.2
3.1.3
3.2
3.3
3.4
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
Sub-Group
Description
Sub
Ref:
4.9
Performance
Indicator
Established quality committee with authority to manage audits and
implement corrective and preventative actions and auditor
recommendations.
Primary Source
of PI
4.10
4.11
4.12
Architectural Plan
5.0.0
5.0.1
5.0.2
5.0.3
Expandability plan
Refresh plan
Change plan
Life-cycle plan - longevity - refresh
Competent Expert
/ Consultant
Technical Management
5.1.0
5.1.1
SocITM KPI 3
ITIL
5.1.2
5.1.3
5.1.4
5.1.5
5.1.6
5.1.7
Link if
Known
SocITM % of
Suc prjts KPI
OGC-ITIL: ITIL 1; ITIL 2
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Software Plan
6.0
Applications plan.
Licensing plan.
Consultant
Security Management
Systems (Policy,
Procedures) and Plan
7.0
Consultant
7.1
ANAO guide
ANAO BC Guide
ANAO BC Mgmt
7.2
7.3
8.0
SAI Global
PRINCE2 Risk
Assessment
Tools
ITIL
OGC-ITIL: ITIL 1; ITIL 2
8.1
8.2
8.3
8.4
8.5
8.6
8.7
9
Link if
Known
9.0
ITIL
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
10
Sub-Group
Description
Project Implementation
Plan
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
AGIMO
PRINCE2
PRINCE2a
ISO 9000
PRINCE2
CIU guide to
Preparing
Implementation
Plans
OGC
PRINCE2a
Legal Counsel
ISO 9000 - 2000
Decision Map
10.3
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
10.4
AGIMO (coming soon)
OGC
OGC Gateway
Review
Process
10.5
Benefits management plan has been fully committed to since the concept
formulation and through business case development and project
delivery to investment validation.
Benefits management methodology (policy, procedures and practices)
such as employed by the Victorian Government.
OGC.
Cranfield
University, UK.
Victoria Govt.
OGC Benefits
Mgmt.
10.6
CPGs.
Finance (DoFA)
OGC PRINCE2
10.7
Stakeholder management.
PRINCE2
10.8
OGC PRINCE2
PRINCE2a
OGC PRINCE2
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
11
Sub-Group
Description
Sub
Ref:
Performance
Indicator
11.0
11.1
11.2
11.3
11.4
11.5
11.6
Primary Source
of PI
Link if
Known
SAI Global
ISO 9000
11.8
11.9
11.10
12
Project Delivery
12.0
12.1
12.2
12.3
PA Consulting
Group
Section
Ref:
Performance Group
ICT Management
Description
Mgt
Ref No
13
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
12.4
AGIMO
13.0
OGC PRINCE2
AGIMO
ISO 9000 2000
13.1
13.2
13.3
13.4
Records management
Document and data control
In and out communications register
Client and stakeholder liaison, consultation, relationship building
Link if
Known
14
Portfolio Management
14.0
Verify that the project is being managed effectively within the context of the agency's full portfolio of programs and projects.
OGC
OGC Portfolio
Mgmt guide
15
Change Management
and Control.
15.0
AGIMO
AGIMO website
16
Project Office
16.0
AGIMO
AGIMO website
17
17.0
ICT Management
Post-Project
Review
Sources
ITIL; AS/NZS4360
Section
Ref:
Performance Group
Description
Gen
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
Mission Statement
1.0
ITIL
OGC-ITIL: ITIL 1; ITIL 2
ITIL
2.0
ITIL
OGC-ITIL: ITIL 1; ITIL 2
ITIL
3.0
ITIL
OGC-ITIL: ITIL 1; ITIL 2
ISO 9000
ITIL
Customer Satisfaction
4.0
ITIL
OGC-ITIL: ITIL 1; ITIL 2
ITIL
4.1
PA Consulting
Group
Section
Ref:
Performance Group
Description
Gen
Ref No
Sub-Group
Description
Sub
Ref:
5.0
5.1.0
5.1.1
5.1.2
5.1.3
5.1.4
Capacity Management.
(Applied to HR, PC,
Middleware, Mainframe,
LAN, WAN, Portals, etc.)
Service Delivery
Performance
Indicator
Primary Source
of PI
Various SLA.
Record of Incident occurrence
Definition of incident
Resolution of incident
Time taken to resolve incident
Confirmation to user that the issue has been resolved
ITIL
6.0
6.1
6.2
6.3
6.4
Total capacity
Capacity expandability
Peak volume
Total volume
Demand growth
AGIMO
7.0
7.1
7.2
7.3
PA Consulting
Group
Link if
Known
OGC-ITIL: ITIL 1; ITIL 2
ITIL
This group of Performance Indicators examines the indicators appropriate to measuring workstation costs.
Sources
SocITM; IEEE;
Section
Ref:
Performance Group
PC / Workstation Costs
Description
PC
Ref No
1
Sub-Group
Description
Acquisition cost per
workstation
Sub
Ref:
1.0
1.1
1.2
2
2.0
2.1
Performance
Indicator
Primary Source
of PI
SocITM KPI 4
SocITM KPI 7
Link if
Known
SocITM Cost of
PCs KPI
SocITM
cost calc.
W/S
SocITM support
KPI
SocITM
Support
Calc.
W/S
Cost
Section
Ref:
Performance Group
PC / Workstation Costs
Description
PC
Ref No
3
Sub-Group
Description
Workstation support per
support specialist
Sub
Ref:
3.0
3.1
Distributed computing
4.0
capability (savings/income)
Performance
Indicator
Primary Source
of PI
SocITM KPI 8
Agency
Secretary, CIO,
Risk Manager
and Security
Manager
IBM
Link if
Known
SocITM
Support/spec't
KPI
SocITM Cost of
Support
Spec't
Calc
IBM-Grid
Grid.Org
IEEE Distributed
Computing
PIC10 - Servers
Purpose and Context
This group of Performance Indicators examines the performance indicators in relation to computer servers.
Sources
SocITM; IEEE
Section
Ref:
Performance Group
Servers
Description
Mid
Ref No
1
Sub-Group
Description
Sub
Ref:
Performance
Indicator
1.0
SocITM KPI 4
SocITM KPI 7
SocITM KPI 8
1.1
1.3
2
2.0
2.1
3.0
Primary Source
of PI
Link if
Known
SocITM Cost of
PCs KPI
SocITM
KPI
support
SocITM
Support
Calc.
W/S
Cost
SocITM
Support/spec't KPI
Section
Ref:
Performance Group
Servers
Description
Mid
Ref No
Sub-Group
Description
Sub
Ref:
3.1
Performance
Indicator
support staff, determine the No of servers per support person
and the cost per support person.
Sample support cost per server calculation
Primary Source
of PI
SocITM Cost of
Support
Spec't
Calc
Utilisation Factor
4.0
4.1
Agency CIO
Down time
5.0
Agency CIO
6.0
6.1
6.2
Agency CIO
Risk Manager
and Disaster
Recovery
Manager
7.0
UPS
Link if
Known
PIC11 - Mainframe
Purpose and Context
This group of Performance Indicators examines the performance indicators relevant to an evaluation of performance
and price of mainframe computers.
Sources
SocITM; IEEE
Section
Ref:
Performance Group
Mainframe
Description
MFr
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Performance
1.0
1.1
1.2
1.3
1.4
1.5
Capacity MIPS
CPU Utilisation Factor at peak times and overall.
Down time
Recovery time
Frequency of failure
Expandability
Agency CIO
Architecture System
Currency & support
2.0
Agency CIO,
Independent
Consultants
3.0
Agency CIO,
Independent
Consultants
3.1
3.2
4
4.0
Link if
Known
Section
Ref:
Performance Group
Mainframe
Description
MFr
Ref No
Sub-Group
Description
Sub
Ref:
4.1
UPS
5.0
Performance
Indicator
Recovery is fast, systematic, well planned and rehearsed, with
minimal system down time.
Primary Source
of PI
Link if
Known
Recovery
Manager
This group of Performance Indicators examines the performance indicators relevant to software development
Sources
Section
Ref:
Performance Group
Description
Software Development
SD
Ref No
1
Sub-Group
Description
Detailed Methodologies
and Corporate Standards
Sub
Ref:
1.0
1.1
Performance
Indicator
Quality Assurance
2.0
2.1
2.2
Performance
3.0
3.1
Testing
4.0
4.1
4.2
Link if
Known
3.2
Primary Source
of PI
Software
applications
development backlog
5.0
5.1
This group of Performance Indicators examines the indicators applicable to storage, backup and data warehousing of
digital information
Sources
ITIL; CobiT
Section
Ref:
Performance Group
Description
Stor
Ref No
Sub-Group
Description
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Performance
1.0
1.1
1.2
1.3
Down time
Frequency of failure
MIPS
cpu usage
Storage
2.0
2.1
2.2
Agency CIO,
2.3
3
Architecture System
Currency & support
3.0
4.0
Link if
Known
Section
Ref:
Performance Group
Description
Stor
Ref No
5
Sub-Group
Description
Back up & Disaster
Recovery Practices
Sub
Ref:
Performance
Indicator
5.0
5.1
UPS
6.0
Primary Source
of PI
Link if
Known
Agency CIO,
Risk Manager
and Disaster
Recovery
Manager
This group of Performance Indicators examines indicators applicable to various forms of online publishing and transaction
processing
Sources
Section
Ref:
Performance Group
Description
e-Gov
Ref No
Sub-Group
Description
Sub
Ref:
1.0
1.1
1.2
1.3
1.4
1.5
1.6
Performance
Indicator
E-Government
E-Government Users
2.0
2.0.1
2.0.2
2.0.3
2.0.4
2.0.5
2.0.6
Indicators:
Geographic
Demographic
Socio-economic
Professional
Industry
Community, educational, cultural, etc.
3.0
3.0.1
Primary Source
of PI
Link if
Known
Section
Ref:
Performance Group
Description
e-Gov
Ref No
Sub-Group
Description
Sub
Ref:
3.0.2
3.0.3
3.0.4
3.0.5
3.0.6
3.0.7
3.0.8
3.0.9
3.0.10
3.0.11
3.0.12
3.1
Performance
Indicator
Short length of contact time
Easy process
User control of activity
Travel avoidance
Queue avoidance
Instant information download instead of waiting for mail
Avoid travel cost
Avoid telephone cost
Can consult spouse or family whilst online, as opposed to taking everyone to an office
Avoid telephone queuing times
Avoid telephone ACD-IVRs (Automated Call Distribution / Interactive Voice Response units)
Community Access. See: Community
4.0
4.1
4.2
4.3
4.4
4.5
4.6
4.7
4.8
Satisfaction
5.0
Use satisfaction
Primary Source
of PI
Link if
Known
Section
Ref:
Performance Group
Description
e-Gov
Ref No
6
Sub-Group
Description
Web Analytics
Sub
Ref:
6.0
Performance
Indicator
There are many comprehensive texts on Web Analytics that can be found by searching the Internet. One example is Web Analytics Demystified, by E Peterson. The first 68 pages of the book are downloadable as a sample.
Primary Source
of PI
Peterson
General
Link if
Known
Web
Analytics
Demystified
General
Web
Analytics
PIC15 - Telecommunications
Purpose and Context
Sources
SocITM; IIMA
Section
Ref:
Performance
Group
Com
Description
Ref No
Sub-Group
Description
Telecommunications (Project)
Sub
Ref:
Performance
Indicator
Primary Source
of PI
Link if
Known
Cost of Voice
Network
1.0
1.1
1.2
SocITM KPI 5
2.0
2.1
2.2
SocITM KPI 6
E-govt. Portal
3.0
3.1
3.2
3.4
Number of sites
Hits or traffic
Downloads
Uploads / forms completed
LAN
Performance
4.0
4.1
4.4
4.5
CISCO
CISCO
Section
Ref:
Performance
Group
Com
Description
Ref No
Sub-Group
Description
Telecommunications (Project)
Sub
Ref:
4.6
Performance
Indicator
Line use against capacity at peak periods
Primary Source
of PI
IIMA
Internet
5.0
5.1
5.2
5.3
Bandwidth
Speed
Download and upload capacity.
% of use over capacity, in total and at peak
periods.
eMail Protected Markings
6.0
Facilitation of markings for Protected classification status
WAN
Performance
7.0
7.1
7.2
7.3
7.4
7.5
7.6
7.7
7.8
7.9
8
Capacity and
Expandable
Capacity
8.0
Link if
Known
IIMA
AGIMO
CISCO
CISCO
Section
Ref:
Performance
Group
Com
Description
Ref No
Sub-Group
Description
Advanced
Technologies &
Methodologies
Telecommunications (Project)
Sub
Ref:
9.0
9.1
9.2
9.3
9.4
9.5
9.6
9.7
Performance
Indicator
Primary Source
of PI
Link if
Known
PIC16 - Security
Purpose and Context
This group of Performance Indicators examines the indicators applicable to ICT security
Sources
Section
Ref:
Performance Group
Security
Description
Sec
Ref No
Sub-Group
Description
Compliance
Security Resource
commitment
Security Management
Risk Management
Sub
Ref:
1.0
Performance
Indicator
Primary Source
of PI
Link if
Known
ITGI
www.ITGI.org
ISACA.org
ITGI
www.ITGI.org
ISACA.org
3.0
3.1
3.2
3.3
3.4
3.5
3.6
ITGI
www.ITGI.org
ISACA.org
4.0
4.1
4.2
4.3
4.4
ITGI
SAI - AS/NZS
4360:1999
Audit Reports
www.ITGI.org
ISACA.org
2.0
2.1
Section
Ref:
Performance Group
Security
Description
Sec
Ref No
Sub-Group
Description
Security Culture
Proactive Testing
Sub
Ref:
5.0
5.1
5.2
Performance
Indicator
Primary Source
of PI
Link if
Known
Awareness
Attitudes
Secure filing policies, procedures and practices.
ITGI
www.ITGI.org
ISACA.org
ITGI
www.ITGI.org
ISACA.org
7.0
7.1
7.2
7.3
Firewall
IDS (intrusion detection systems)
DMZ
Antivirus
CIO and
competent
specialist
consultants
User security
8.0
8.1
Communications
Performance / Security
9.0
9.1
9.2
9.3
9.4
9.5
ITGI
10
Business Recovery
10.0
ITGI
11
Legal Compliance
11.0
11.1.0
Legal Counsel
6.0.0
6.0.1
6.0.2
6.0.3
www.ITGI.org
ISACA.org
Section
Ref:
Performance Group
Security
Description
Sec
Ref No
Sub-Group
Description
Sub
Ref:
11.1.1
11.1.2
11.1.3
11.1.4
11.1.5
11.1.6
11.1.7
11.1.8
Performance
Indicator
Primary Source
of PI
Link if
Known
including:
- Companies legislation
- Contracts, insurance, Workers Compensation and industrial, HR, superannuation, etc.
- Employment legislation
- Privacy legislation
- Accounting process & records
- Managerial process and records
12
HR Plan
12.0
Management &
HRM
Stakeholders
13
13.0
Management,
HRM & Security
Stakeholders
14
14.0
Management,
Stakeholders
15
Information Security
15.0
CIO
16
16.0
Emergency and
Fire Consultant
17
Security Quality
Management
17.0
SAI &
ISO 9000
18
18.0
AGIMO
SAI Global
ISO 9000