

Thomas R. Gulledge
William P. Hutzler (Eds.)

Analytical Methods
in Software
Engineering Economics
With 45 Figures

Springer-Verlag
Berlin Heidelberg New York
London Paris Tokyo
Hong Kong Barcelona
Budapest


Professor Dr. Thomas R. Gulledge
The Institute of Public Policy
George Mason University
4400 University Drive
Fairfax, VA 22030-4444, USA

Dr. William P. Hutzler
Economic Analysis Center
The MITRE Corporation
7525 Colshire Drive
McLean, VA 22102-3481, USA

ISBN-13: 978-3-642-77797-4
e-ISBN-13: 978-3-642-77795-0
DOI: 10.1007/978-3-642-77795-0

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data banks. Duplication of this publication or parts thereof is only permitted under the provisions of the German Copyright Law of September 9, 1965, in its version of June 24, 1985, and a copyright fee must always be paid. Violations fall under the prosecution act of the German Copyright Law.

© Springer-Verlag Berlin Heidelberg 1993
Softcover reprint of the hardcover 1st edition 1993

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

214217130-543210 - Printed on acid-free paper


PREFACE
This volume presents a selection of the presentations from the first annual conference on Analytical Methods in Software Engineering Economics, held at The MITRE Corporation in McLean, Virginia. The papers are representative of the issues that are of interest to researchers in the economics of information systems and software engineering economics.
The 1990s are presenting software economists with a
particularly difficult set of challenges. Because of budget
considerations, the number of large new software development
efforts is declining. The primary focus has shifted to issues
relating to upgrading and migrating existing systems. In this
environment, productivity enhancing methodologies and tools
are of primary interest.
The MITRE Software Engineering Analysis Conference was designed to address some of the new and difficult challenges that face our profession. The primary objective of the conference was to address new theoretical and applications directions in Software Engineering Economics, a relatively new discipline that deals with the management and control of all segments of the software life-cycle. The discipline has received much visibility in the last twenty-five years because of the size and cost considerations of many software development and maintenance efforts, particularly in the Federal Government.

We thank everyone who helped make this conference a
success, especially those who graciously allowed us to
include their work in this volume.
Thomas R. Gulledge
The Institute of Public Policy
George Mason University
Fairfax, Virginia 22030 USA
William P. Hutzler
Economic Analysis Center
The MITRE Corporation
McLean, Virginia 22102 USA


TABLE OF CONTENTS

I. Plenary Presentation

Economic Analysis of Software Technology Investments ... 1
Barry W. Boehm

II. Software Economics

Measuring the Development Performance of Integrated Computer Aided Software Engineering (I-CASE): A Synthesis of Field Study Results From the First Boston Corporation ... 39
Rajiv D. Banker and Robert J. Kauffman

Returns-to-Scale in Software Production: A Comparison of Approaches ... 75
Patricia E. Byrnes, Thomas P. Frazier, and Thomas R. Gulledge

An Economics Model of Software Reuse ... 99
R.D. Cruickshank and J.E. Gaffney, Jr.

Experience With an Incremental Ada Development in Terms of Progress Measurement, Built-in Quality, and Productivity ... 139
Donald H. Andres, Paul E. Heartquist, and Gerard R. LaCroix

Recognizing Patterns for Software Development Prediction and Evaluation ... 151
Lionel C. Briand, Victor R. Basili, and William M. Thomas

Calibration of Software Cost Models to DoD Acquisitions ... 171
Audrey E. Taub

Estimating Software Size From Counts of Externals: A Generalization of Function Points ... 193
J.E. Gaffney, Jr. and R. Werling

CECOM's Approach for Developing Definitions for Software Size and Software Personnel: Two Important Software Economic Metrics ... 205
Stewart Fenick

An Economic Analysis Model for Determining the Custom Versus Commercial Software Tradeoff ... 237
Michael F. Karpowich, Thomas R. Sanders, and Robert E. Verge


Economic Analysis of Software Technology Investments
Barry W. Boehm
Defense Advanced Research Projects Agency
University of California, Los Angeles
Computer Science Department
Los Angeles, CA 90024
1. Introduction

1.1 Background
Many large organizations are finding that:
• Software technology is increasingly critical to their future organizational performance.
• Organizational expenditures on software are increasing.
• Investments in software technology provide opportunities to reduce software costs and increase organizational performance.

The U.S. Department of Defense (DoD) is one such organization. It has embarked on the development of a DoD Software Technology Strategy (SWTS) [BOEHM91a] to:
• Identify its current and future software technology needs.
• Analyze and adjust its current software technology investment portfolio to better meet DoD needs.
• Formulate alternative DoD software technology investment portfolios, and analyze them with respect to DoD needs and estimated cost savings.

This paper summarizes one of several analyses undertaken to evaluate
alternative DoD software technology investment portfolios. The analysis estimates the
DoD software cost savings likely to result from alternative levels of DoD investment
and calculates the resulting estimated returns on investment (ROI).
The dollar figures used in this paper represent current and proposed alternative technology investment and savings figures used at one stage of the development of the SWTS. At this point, they are representative of SWTS data and conclusions, but not necessarily accurate with respect to the final figures to be used in the SWTS.

1.2 Overview

The software technology return-on-investment (ROI) analysis presented in this paper considers three alternative programs:

1. A Baseline: No significant DoD investments are undertaken to improve DoD software technology. DoD would continue to benefit at the current 4% rate of software productivity improvement resulting from commercial-sector software improvements.

2. A Current software technology program: Achieving the best results possible from a program that reflects the current flat-dollar software technology budgets of most DoD organizations. In then-year dollars, the Current Program level used in this analysis is around $195M/year between FY1992 and FY1995. Its level is $192M in FY1996; this $192M is extended for each year between FY1997 and FY2008. In 1992 dollars, the resulting 2008 level of investment would be $88M.

3. An Achievable software technology program, described in further detail in the SWTS. This program would increase the DoD software technology level of investment from $195M to $410M between FY1992 and FY1997, and apply a 3% growth factor to this $410M baseline thereafter. By FY2008, this would be $568M in 2008 dollars and $260M in 1992 dollars, using a 5% deflation rate.

The major questions to be answered by the ROI analysis were:
• Can the Current Program be justified with respect to the no-investment Baseline situation?
• Can the Achievable Program be justified with respect to the Current Program?
• Can the Achievable Program or the Current Program realize the SWTS objective of reducing software unit costs by a factor of two in the year 2000?

The ROI analysis is carried out by estimating a set of 1992-2008 time series of technology fractions-of-time-used (FTs) and fractional-savings (FSs) resulting from the Current and Achievable software technology programs, and using the model presented in Section 2 to calculate the resulting cost savings, net present values, and ROIs.¹

Section 2 describes the structure of the SWTS ROI model. Section 3 provides a summary of the inputs used to calculate the ROI results, with rationales relating the choice of ROI input quantities to the expected stream of software technology results produced by the Current and Achievable Programs. Section 4 presents and discusses the resulting estimated DoD software cost savings and ROI results. Section 5 presents the resulting conclusions.

2. SWTS ROI Model

The SWTS ROI model begins by computing the estimated cost savings resulting from three major sources: "work avoidance" through software reuse technology improvements; "working smarter" through process technology improvements; and "working faster" through improvements in software tools and environments. These cost savings are calculated for both development and maintenance for the years 1992 through 2008.²
Achieving the end results of the Current and Achievable software technology
programs requires investment in new software technologies to achieve cost savings.
To assess the potential worth of these investments, two financial measures of merit are
computed. One measure is the ROI mentioned above. The other measure is net
present value (NPV). Both measures account for the time value of money.

¹For a complete definition of net present value and ROI, see Section 2.4.
²This analysis has been automated in a model using Microsoft Excel™ macros by The Institute for Defense Analyses. The resulting tool provides a set of pull-down menus that allow the user to rapidly change a number of assumptions underlying the analysis and obtain graphic output of the resulting savings. See [BOEHM91b].


The remainder of this section describes the model structure and parameters, shows algebraic representations of how cost savings are calculated, and provides examples of how the formulas are applied to the alternative SWTS programs.

2.1 Model Structure and Parameters

The Baseline scenario used as the starting point of the analysis represents the estimates of the annual level of DoD software expenditure in the absence of any significant DoD software technology investment. The analysis assumes that for the initial year, 1992, DoD software expenditure will be $24 billion (B). This estimate is conservative: 1990 estimates have ranged from $24B to $32B. The analysis also assumes that this number will increase over time at a 5% rate through the year 2008. This growth rate was calculated assuming that the annual DoD software output will be reduced to a growth rate of 4% by DoD budget limitations. This demand growth would be absorbed by improvements in commercial software technology, which are likely to continue to produce 4% annual productivity gains. Thus, the Baseline scenario represents a constant DoD software work force level; the 5% cost growth rate results from an assumed 5% inflation rate. This estimate is also very conservative. The analysis assumes that the distribution of this expenditure between development and maintenance is 30% for development and 70% for maintenance. Using the information from above, the 1992 baseline would be $7.2B for development and $16.8B for maintenance.

Table 1 below summarizes these parameters and the sources upon which they are based.
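The Baseline arithmetic above can be sketched as follows. This is an illustrative reconstruction only; the function name and interface are ours, and the actual SWTS model is an Excel-macro tool.

```python
# Sketch of the Baseline cost computation described above: a $24B 1992
# total growing at 5% per year, split 30% development / 70% maintenance.
# (Illustrative only; not the SWTS tool itself.)

def baseline_costs(year, total_1992=24.0, growth=0.05, dev_share=0.30):
    """Return (development, maintenance) Baseline costs in $B for `year`."""
    total = total_1992 * (1 + growth) ** (year - 1992)
    return dev_share * total, (1 - dev_share) * total

dev, maint = baseline_costs(1992)   # $7.2B development, $16.8B maintenance
dev_2000, _ = baseline_costs(2000)  # about $10.6B, as used in Section 2.2
```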
The estimated effects of the Current and Achievable DoD technology programs
are calculated by adjusting the original cost baseline by annual estimates of the cost
saving effects of work avoidance, working smarter, and working faster on both software
development and maintenance. Note that the manner in which the baseline costs are
computed implies that there will be a 4% gain in productivity whether any of the
initiatives are employed or not. The adjustments to the baseline through work
avoidance, working smarter, and working faster are in addition to such "natural"
productivity trends.


Table 1: Baseline Parameters

Parameter Category                 Parameter Value or Name       Source
Total DoD 1992 Software Spending   $24 billion                   EIA90 ($32B), AVWK91 ($31B)
Development/Maintenance Split      30% Development,              BOEHM81, EIA90
                                   70% Maintenance
Growth Rates:
  DoD SW Cost                      5%                            AVWK91 (7%), EIA90 (1%)
  Productivity Growth              4%                            MARTIN83, LEVITAN88
  Inflation rate                   5%

As noted above, the analysis identifies three sources of cost savings caused by the Initiative; these are formally described as end product cost avoidance (EPCA), process cost avoidance (PCA), and process cost reduction (PCR). EPCA represents cost savings from avoiding the need to write more lines of code: via software reuse, application generators, commercial off-the-shelf (COTS) software, Ada generics, and other product-related improvements. PCA represents savings from process-related improvements that enable projects to avoid costly rework by "working smarter." Examples of PCA technology improvements are prototyping and risk management technology, and those captured by the Software Engineering Institute (SEI) software process assessment. PCR represents savings from improvements in software engineering environments (SEEs) and better, more interoperable tools that partially automate software development and maintenance, enabling people to "work faster" on those core portions of the software process remaining after one has eliminated avoidable product efforts via EPCA and avoidable process efforts via PCA.

Table 2 summarizes the sources of savings used in this analysis.


Table 2: Savings Sources

Source            Formal Name                  Abbreviation   Characteristic
reuse             end product cost avoidance   EPCA           work avoidance
working smarter   process cost avoidance      PCA            rework avoidance
working faster    process cost reduction      PCR            tools & environments

The analysis is divided into development and maintenance portions. Cost savings are determined by the multiplicative product of the fraction of time the improvements are used and the fraction of savings realized when the improvements are used.

2.2 Calculating Development Savings

As noted above, the analysis estimates the savings resulting from software technology improvements for the years 1992 to 2008. For each year and source of savings (EPCA, PCA, and PCR for both development and maintenance), some value for FS from the use of technologies and some FT value are postulated. The proportion of cost avoidance caused by a given source of savings in a given year is calculated by multiplying FS by FT. The product of FT and FS is then subtracted from 1 and the result is multiplied by the annual baseline cost to give the annual cost under a software technology improvement program.
An example may make this computation clear. If one were to estimate the FS from EPCA in the year 2000 to be 80% and the fraction of software reused rather than being developed (FT) to be 12%, the resulting savings would be 0.80 * 0.12 = 0.096, or 9.6%. Subtracting this value from 1 gives 0.904, which could be thought of as a residual cost fraction (RCF), the fraction of costs left after avoidable end-product costs have been eliminated. Using the baseline development cost for the year 2000, which is computed (assuming 5% compounded growth from a 1992 value of $7.2B) to be $10.6B, the new costs would be $10.6B * 0.904 = $9.6B. This means $1B of savings from development reuse or EPCA would come in the year 2000. Similar calculations would be applied sequentially for PCA and PCR savings. For example, the FT and FS for PCA in 2000 are estimated to be 75% and 16%, respectively. Thus 0.75 * 0.16 = 0.12, or 12%. The RCF would be 1 - 0.12 = 0.88. Applying the RCF to the $9.6B yields $8.45B. Similarly for the PCR, FT and FS for 2000 are 75% and 7%, respectively. The RCF is calculated to be 0.948. Applying this to the $8.45B yields $8.01B. The difference between the baseline development estimate, $10.6B, and the residual cost after savings from all three sources, $8.01B, is the total dollar development savings, in this case, $2.59B in 2000.
The example above is summarized in Table 3.

Table 3: Algebraic Example of Year-2000 Development EPCA Savings

Category   RCF                         ADC      RADC                       ADS
EPCA       0.904 = 1 - (0.8 * 0.12)    $10.6B   $9.6B = $10.6B * 0.904     $1.0B = $10.6B - $9.6B
PCA        0.88 = 1 - (0.75 * 0.16)    $9.6B    $8.45B = $9.6B * 0.88      $1.15B = $9.6B - $8.45B
PCR        0.948 = 1 - (0.75 * 0.07)   $8.45B   $8.01B = $8.45B * 0.948    $0.44B = $8.45B - $8.01B
Total ADS                                                                  $2.59B

Notes: ADS = annual development savings. RADC = residual annual development cost. ADC = annual development software cost. RCF = residual cost fraction; computed as 1 - (FT x FS) for each component of savings. ADS = ADC - RADC. RADC = ADC x RCF.
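The sequential RCF computation of Section 2.2 can be sketched as follows; the helper function is our own minimal illustration, not part of the SWTS tool.

```python
# Sketch of the chained savings calculation: each source of savings
# multiplies the running cost by its residual cost fraction
# RCF = 1 - FT * FS, applied sequentially (EPCA, then PCA, then PCR).

def annual_savings(baseline_cost, sources):
    """sources: list of (FT, FS) pairs applied in order.
    Returns (total savings, residual cost) in the units of baseline_cost."""
    cost = baseline_cost
    for ft, fs in sources:
        cost *= 1.0 - ft * fs   # apply this source's RCF
    return baseline_cost - cost, cost

# Year-2000 development example with the Table 3 inputs
savings, residual = annual_savings(10.6, [(0.12, 0.80), (0.75, 0.16), (0.75, 0.07)])
# savings ~= $2.6B and residual ~= $8.0B, matching Table 3 up to rounding
```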

2.3 Calculating Maintenance Savings

The analysis also estimates the savings for maintenance resulting from software technology improvements for the years 1992 to 2008. For each year, FTs and FSs are estimated. The technologies and processes that cause these savings are listed below.
• EPCA: (1) use of COTS and (2) the megaprogramming technology described in the SWTS: Ada generics, domain-specific software architectures (DSSAs), module composition technology, application generators.
• PCA: (1) improved maintenance process and (2) improved understandability of software.
• PCR: (1) increased use of tools and environments and (2) better structured, easy-to-modify software.

Table 4 presents a similar algebraic example of the maintenance savings for EPCA in the year 2000. The Baseline annual maintenance software cost is computed to be $24.8B; the three sources of software technology savings reduce this to $19.1B, for a total savings of $5.7B.

Table 4: Algebraic Example of Year-2000 Maintenance EPCA Savings

Category   RCF                        AMC      RAMC                       AMS
EPCA       0.872 = 1 - (0.16 * 0.8)   $24.8B   $21.6B = $24.8B * 0.872    $3.2B = $24.8B - $21.6B
PCA        0.91 = 1 - (0.65 * 0.14)   $21.6B   $19.7B = $21.6B * 0.91     $1.9B = $21.6B - $19.7B
PCR        0.97 = 1 - (0.7 * 0.05)    $19.7B   $19.1B = $19.7B * 0.97     $0.6B = $19.7B - $19.1B
Total AMS                                                                 $5.7B = $24.8B - $19.1B

Notes: AMS = annual maintenance savings. RAMC = residual annual maintenance cost. AMC = annual maintenance software cost. RCF = residual cost fraction, computed as 1 - (FT x FS) for each component of savings. AMS = AMC - RAMC. RAMC = AMC x RCF.

2.4 Calculating ROI and NPV

To achieve the software development and maintenance cost savings discussed above, a substantial investment by the DoD would be required. To assess the potential worth of such investments, two financial measures of merit are computed. One measure is the ROI. The other measure is NPV. Both measures are calculated from constant dollars and account for the time value of money by "discounting" the benefits (in this case the DoD cost savings) and the costs (i.e., the DoD investment).³
The formula used in the NPV computation can be shown as:

    NPV = sum from t=0 to m of (St - Ct) / (1 + d)^t

where
    St = the cost savings for year t.
    Ct = the dollar value of the investment in year t.
    d  = the discount rate.
    m  = the number of years over which the calculations are made. In this case, m = 16, and t = 0 corresponds to the year 1992.

To be consistent with OMB guidelines [OMB72], we assume the discount rate to be 10%. The resulting NPV figure is the present value (or worth today) of the stream of savings derived from the stream of investments made over the period of this analysis.

The ROI computation also is closely related to the NPV figure. The ROI measure is the ratio of the discounted savings to the discounted costs. Algebraically this can be shown as:

    ROI = [sum from t=0 to m of St / (1 + d)^t] / [sum from t=0 to m of Ct / (1 + d)^t]

³Constant dollars are used so that, after adjusting for inflation, a dollar in the future has the same purchasing power as a dollar in the present. Discounted dollars are used so that, after discounting, a future dollar has the same value to us now as does a dollar in the present.


where the variables are defined as above.

The ROI figure used in this analysis is interpreted as the return for a dollar of investment when adjusted for price-level changes and the time value of money. For example, if the ROI is computed to be 6, then this figure suggests that for every constant, time-discounted dollar invested by the government, 6 constant, time-discounted dollars in savings will be returned.

3. Inputs to the Return on Investment Analysis

This section presents the input estimates used in the ROI analysis and the rationales for the numerical values estimated. As the ROI model is automated with adjustable parameters, the effect of alternative estimates can readily be calculated. The input estimates discussed below are:

1. Reuse (EPCA) inputs.
2. Working-smarter (PCA) inputs.
3. Working-faster (PCR) inputs.
4. DoD Baseline software costs.
5. Current and Achievable software technology investment levels.

3.1 Reuse (End Product Cost Avoidance) Inputs

The reuse fraction of time FT (EPCA) represents the fraction of DoD software reused across all DoD software development or maintenance activities that would otherwise have involved developing or modifying code at the Ada or COBOL level, or below (e.g., assembly code). As discussed in [BOEHM81] and elsewhere, there are various levels of reuse of code, specifications, and other software artifacts that lead to different levels of savings. For this analysis, FT is defined to cover only situations where essentially all code in a module is reused, or where coding is avoided by using very high level languages (VHLLs) or application generators.

For module reuse, extensive measured experience in the NASA Software Engineering Laboratory has indicated that the average savings fraction FS (EPCA) is 0.8 [BASILI81, SEID89]. Savings from VHLLs or application generators have typically varied from 0.5 to 0.9, depending primarily on the maturity of the technology.

Development. Early gains will come primarily from use of commercial off-the-shelf (COTS) software, particularly in the Corporate Information Management (CIM) area. In the mid-90s, module reuse supported by process improvements and repositories (e.g., STARS, RAPID) will boost reuse. In the late 90s, major gains will begin from investments in DoD DSSAs and early module-composition megaprogramming technology. At this point, gains from the reduced-effort Current Program begin to tail off, while gains from the Achievable Program are estimated to increase. These comparative trends are estimated to continue through about 2003-2008, as the additional megaprogramming technology in the Achievable Program matures. Some factors, particularly cultural inertia and the rapid technical change in underlying computer architectures, will serve as retardants to progress toward complete, reuse-based software development.

The resulting estimated development EPCA time series are as follows:
Table 5: Estimated Development EPCA Time Series

Current
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (EPCA)   .005   .02    .05    .08    .12    .15    .18    .20    .22
FS (EPCA)   .70    .75    .78    .80    .80    .80    .80    .80    .80

Achievable
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (EPCA)   .005   .02    .06    .12    .20    .30    .40    .47    .52
FS (EPCA)   .70    .75    .78    .80    .82    .84    .86    .87    .88

Maintenance. Maintenance reuse savings will come from two primary sources:
• Use of COTS software: the net savings will be the difference between the amount of non-COTS modification that would otherwise have been needed, and the COTS maintenance fees.
• Modification avoidance via megaprogramming technology: initially Ada generics and similar capabilities, and eventually maintenance via module replacement based on DSSA and module-composition capabilities, plus life-cycle gains from VHLLs and application generators. These modularity-based savings come both from reengineering existing software and introduction of newly developed module-based software into the downstream maintenance inventory.

Estimated gains in the early 90s come primarily from replacement of DoD-unique software inventory by COTS, particularly in the CIM area. As in the development phase, estimated maintenance gains in the late 90s and 2000s become larger for the Achievable Program than for the Current Program, because of the stronger DSSA, VHLL, application generator, and module composition capabilities made available to DoD via the Achievable Program.

The resulting estimated maintenance EPCA time series are as follows:

Table 6: Estimated Maintenance EPCA Time Series

Current
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (EPCA)   .02    .061   .085   .12    .16    .20    .22    .24    .26
FS (EPCA)   .70    .75    .78    .80    .80    .80    .80    .80    .80

Achievable
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (EPCA)   .02    .071   .12    .18    .25    .32    .40    .48    .56
FS (EPCA)   .70    .75    .78    .80    .81    .82    .83    .84    .85

3.2 Working-Smarter (Process Cost Avoidance) Inputs

A quantitative understanding of the distribution of cost across the various activities involved in the software process is crucial to estimating process cost savings, both for process cost avoidance (PCA) and process cost reduction (PCR). The analysis below is based on a value-chain analysis [PORTER80] of typical process cost distributions, based on a sample of 40 DoD software projects [BOEHM88]. The potential effects of process and tool improvements on each of the canonical software development and maintenance activities (requirements analysis, prototyping, design, etc.) are estimated below, based on their initial value-chain cost distributions.
Table 7 shows the results of the analysis for software development. The first column shows the current cost distribution across development activities: 4% of current costs (assuming an overall system design as starting point) go to software requirements analysis, while 15% goes to coding and related activities such as unit test. Columns two and three show the potential effects of working-smarter process improvements. The effort devoted to requirements analysis is increased from 4% to 6%, while the effort devoted to coding activities is decreased from 15% to 7% (reduced rework caused by better requirements and design, reduced project turbulence because of better, pre-verified interface definitions, and reduced make-work such as full-scale critical design review). Columns four and five show the potential effects of tools and environmental support to make the remaining essential work go faster. For requirements analysis, better modeling and specification tools could reduce the 6% figure to 5%. For coding-related activities, better tools for automating portions of the coding and unit testing process, and better support of group-coordination and change-effect processing, could reduce the 7% figure to 5%. The final column shows the resulting normalized cost percentages: both requirements analysis and code would consume 11% of the reduced total cost.

The total potential working-smarter (or PCA) savings is thus 37% of the original total. This figure is consistent with [JONES86], which has rework costs increasing to 50% for very large projects. The subsequent potential working-faster (or PCR) savings is 30% of the post-PCA costs, or 19% of the original total. This figure is conservative with respect to the 33%-50% productivity gains for tool and environment support in software cost estimation models such as the Constructive Cost Model (COCOMO) and Ada COCOMO [BOEHM81, BOEHM89].

Table 7: Potential Software Development Savings - Value Chain Analysis

                     Current   Work-     Remainder   Work-     Overall     Revised
Activity             Cost %    Smarter               Faster    Remainder   Cost %
                               Savings               Savings
Rqts. Analysis       4         +2        6           -1        5           11
Prototyping          3         +2        5           -1        4           9
Rqts. Trace          4         -1        3           -1        2           5
Design               12        -1        11          -3        8           18
Code                 15        -8        7           -2        5           11
Integration & Test   16        -8        8           -2        6           14
Documentation        14        -8        6           -2        4           9
Config. Mgmt.        5         -1        4           -2        2           5
Management           15        -8        7           -2        5           11
Other*               12        -6        6           -3        3           7
Total                100       -37       63          -19       44          100
                                                     (30% of 63)

* "Other" includes project communications, quality assurance functions, training functions, security management, simulation.
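The value-chain arithmetic behind Table 7 can be sketched as follows, using the two activities spelled out in the text (requirements analysis and code). The function is our illustration, not part of the SWTS tool, and the two-row input is deliberately partial.

```python
# Sketch of the Table 7 computation: add the working-smarter and
# working-faster deltas to each activity's current cost share, then
# renormalize the overall remainder to a revised 100% distribution.

def value_chain(rows):
    """rows: {activity: (current_pct, smarter_delta, faster_delta)}.
    Returns (overall_remainder_total, {activity: revised_pct})."""
    remainder = {a: cur + ws + wf for a, (cur, ws, wf) in rows.items()}
    total = sum(remainder.values())
    revised = {a: round(100.0 * r / total) for a, r in remainder.items()}
    return total, revised

total, revised = value_chain({
    "Rqts. Analysis": (4, +2, -1),   # 4% -> 6% -> 5%
    "Code":           (15, -8, -2),  # 15% -> 7% -> 5%
})
# With only these two rows each revises to 50%; over all ten Table 7
# activities the overall remainder totals 44% and both revise to 11%.
```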

Development Process Cost Avoidance. The fraction of time process improvements are used, FT (PCA), is estimated as the fraction of the DoD software performer base that has improved itself at least one level on the five-level Software Engineering Institute (SEI) process maturity assessment scale [HUMPHREY89]. Most contractors and internal DoD software organizations are still at Level 1. The prospect of using maturity level as a contractor source selection criterion, or recent directives for internal CIM organizations to use SEI assessments [STRASSMANN91], will cause relatively rapid early increases in FT (PCA). However, cultural inertia will still leave some DoD software projects at Level 1 in the year 2008.

Table 8: Potential Software Maintenance Savings - Value Chain Analysis

                     Current   Work-     Remainder   Work-     Overall     Revised
Activity             Cost %    Smarter               Faster    Remainder   Cost %
                               Savings               Savings
Rqts. Analysis       6         0         6           -1        5           12
Prototyping          0         +2        2           0         2           5
Rqts. Trace          2         +1        3           -1        2           5
Design               11        -3        8           -2        6           14
Code                 14        -7        7           -2        5           12
Integration & Test   20        -10       10          -3        7           17
Documentation        16        -9        7           -3        4           9
Config. Mgmt.        4         0         4           -2        2           5
Management           15        -7        8           -2        6           14
Other*               12        -6        6           -3        3           7
Total                100       -39       61          -19       42          100
                                                     (31% of 61)

* "Other" includes project communications, quality assurance functions, training functions, security management, simulation.
The fractional savings FS (PCA) from working smarter is estimated as a function of the average number of process maturity levels that organizations developing DoD software have progressed. Improving one level of maturity is estimated to produce a 0.14 savings fraction; two levels, 0.24; three levels, 0.32; and four levels, 0.37. The SEI is collecting data to provide better quantitative information on the effects of process maturity on software costs.

The resulting estimated development PCA time series are given in Table 9. The FT series are the same for the Current and Achievable Programs, as process maturity adoptions are primarily a function of DoD management initiatives. The FS are estimated to be considerably higher in the out-years for the Achievable Program, because of significantly better technology support of tailorable process models, process programming, prototyping, and knowledge-based risk management aids.

Table 9: Estimated Development PCA Time Series

Current
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (PCA)    .05    .25    .50    .65    .75    .80    .85    .89    .92
FS (PCA)    .12    .13    .14    .15    .16    .18    .20    .22    .24

Achievable
Program     1992   1994   1996   1998   2000   2002   2004   2006   2008
FT (PCA)    .05    .25    .50    .65    .75    .80    .85    .89    .92
FS (PCA)    .12    .14    .16    .20    .24    .27    .30    .32    .34

Maintenance. Estimated FT adoption rates for maintenance process
improvements show significant increases similar to those for development. The
maintenance rates are somewhat lower, since maintenance processes are more difficult
to decouple from their large inventories of existing software. The estimated FS
rework-avoidance savings are lower than development savings for similar reasons, but
this is compensated for by technology contributions to process cost avoidance.
Software understanding and reengineering technology will avoid much of the cost in
software maintenance currently devoted to the process of understanding poorly
structured and poorly explained software. This cost is estimated to be as high as 47%
of the maintenance effort [PARIKH83]. Improving one level of process maturity is
estimated to produce a combined rework-avoidance and understanding-improvement
savings fraction of 0.10; two levels, 0.18; three levels, 0.26; and four levels, 0.34.
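The maturity-level savings fractions quoted above for development and maintenance can be collected into a simple lookup. A minimal sketch (the table and function names are ours):

```python
# Process-maturity savings fractions quoted in the text:
# number of maturity levels improved -> savings fraction.
DEV_PCA_SAVINGS = {1: 0.14, 2: 0.24, 3: 0.32, 4: 0.37}    # development
MAINT_PCA_SAVINGS = {1: 0.10, 2: 0.18, 3: 0.26, 4: 0.34}  # maintenance

def savings_fraction(levels_improved, table):
    """Savings fraction for improving the given number of maturity levels;
    zero if no improvement."""
    return table.get(levels_improved, 0.0)

print(savings_fraction(2, DEV_PCA_SAVINGS))    # 0.24
print(savings_fraction(3, MAINT_PCA_SAVINGS))  # 0.26
```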


As with development PCA, the FS estimates for the Achievable Program are
considerably higher in the out-years than for the Current Program, because of
significantly better technology support for software understanding and reengineering.
The resulting maintenance PCA time series are given below.
Table 10:    Estimated Maintenance PCA Time Series

                        1992  1994  1996  1998  2000  2002  2004  2006  2008
Current Program
  FT (PCA)              .05   .20   .40   .55   .65   .70   .75   .80   .84
  FS (PCA)              .10   .10   .11   .12   .14   .16   .17   .18   .19
Achievable Program
  FT (PCA)              .05   .20   .40   .55   .65   .70   .75   .80   .84
  FS (PCA)              .10   .11   .13   .16   .20   .25   .30   .35   .40

3.3    Working-Faster (Process Cost Reduction) Inputs
The fraction of time PCR tools and environments are used, FT (PCR), is
estimated as the fraction of the DoD software performer base that has improved itself at
least one level on an ascending computer-aided software engineering (CASE) maturity
hierarchy for tools and environment support. The maintenance PCR savings are also
enhanced by reengineering technology improvements and by better-structured software
entering the maintenance inventory. The CASE maturity levels and their
corresponding savings fractions FS (PCR) for development and maintenance are given
in Table 11.
Table 11:    Levels of Tool and Environment Support

                                                    Development  Maintenance
CASE Maturity Level                                 FS (PCR)     FS (PCR)
1. Minimal                                          0.00         0.00
2. 1991 CASE Tools                                  0.07         0.06
3. Integrated CASE Environment                      0.14         0.14
4. Integrated, Fully-Populated CASE Environment     0.23         0.24
5. Proactive CASE Environment                       0.30         0.32


The savings fractions at the lower levels are low because CASE tools are
frequently purchased and installed without the associated tailoring, training, and process
integration needed to make them pay off. In some situations, indiscriminate
purchasing of CASE tools has actually reduced productivity.
Development and Maintenance. The resulting development and
maintenance PCR time series are given in Tables 12 and 13. The comparisons
between the Current Program and Achievable Program are similar to those for PCA;
the estimated FT (PCR) adoption rates are the same, while the estimated FS (PCR)
savings fractions are considerably higher in the out-years because of the significantly
higher levels of DoD-responsive advanced CASE technology.
The FS (PCR) are net savings, which have reduced the gross savings by 0.05,
reflecting the typical 5% added to the cost of doing business for the purchase,
amortization, and maintenance fees for CASE tools and workstations. Thus, the 1992
development savings fraction is not 0.07, as might be expected from Table 11, but
rather 0.02.
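The net-savings adjustment described above is a flat subtraction; a minimal sketch (function name is ours):

```python
# Net FS (PCR): gross savings fraction from Table 11 minus the assumed 0.05
# cost of purchasing, amortizing, and maintaining CASE tools and workstations.
CASE_OVERHEAD = 0.05

def net_fs_pcr(gross_fs):
    """Net savings fraction after the 5% CASE tool/workstation overhead."""
    return round(gross_fs - CASE_OVERHEAD, 2)

# 1992 development: CASE maturity level 2 gives a gross FS of 0.07,
# so the net figure that enters Table 12 is 0.02, not 0.07.
print(net_fs_pcr(0.07))  # 0.02
```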
Table 12:    Estimated Development PCR Time Series

                        1992  1994  1996  1998  2000  2002  2004  2006  2008
Current Program
  FT (PCR)              .15   .35   .50   .65   .75   .80   .85   .89   .92
  FS (PCR)              .02   .04   .05   .06   .07   .08   .09   .10   .11
Achievable Program
  FT (PCR)              .15   .35   .50   .65   .75   .80   .85   .89   .92
  FS (PCR)              .02   .04   .07   .11   .15   .18   .21   .23   .25

3.4    DoD Baseline Software Costs
The DoD baseline software cost profile from which savings are calculated is
that discussed in Section 2.1, in which no significant DoD efforts are undertaken to
improve DoD software technology. Past experience indicates that one would expect a
4% per year general improvement in software productivity to apply to the DoD. The
Baseline scenario limits DoD software demand growth to 4%.
Table 13:    Estimated Maintenance PCR Time Series

                        1992  1994  1996  1998  2000  2002  2004  2006  2008
Current Program
  FT (PCR)              .10   .30   .45   .60   .70   .75   .80   .84   .87
  FS (PCR)              .01   .02   .03   .04   .05   .06   .07   .08   .09
Achievable Program
  FT (PCR)              .10   .30   .45   .60   .70   .75   .80   .84   .87
  FS (PCR)              .01   .03   .06   .09   .13   .17   .21   .24   .27

As discussed in Section 2.1, it was assumed that the DoD will spend $24B in
1992 for software and that this $24B can be separated into development (30%) and
maintenance (70%). A 5% inflation rate is also assumed, yielding a net growth in
DoD software expenditures of 5% per year compounded over the period of the analysis.
The results are shown below in Table 14.

Table 14:    Baseline Estimates of DoD Software Expenditures
             (Billions of Then-Year Dollars)

$B                      1992  1994  1996  1998  2000  2002  2004  2006  2008
Total DoD Software      24.0  26.5  29.2  32.2  35.5  39.1  43.1  47.5  52.4
Maintenance             16.8  18.5  20.4  22.5  24.8  27.4  30.2  33.3  36.7
Development              7.2   7.9   8.8   9.6  10.6  11.7  12.9  14.3  15.7
Note: In this table, as with all tables that report spreadsheet results, the columns do not
always add or subtract exactly because of rounding.
To account for the price-level changes over time, the estimates of savings and
investments have been deflated to constant 1992 dollars. Hereafter, the results of the
analyses will be presented in both then-year and constant dollars.
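The baseline arithmetic described above ($24B in 1992, a 70/30 maintenance/development split, 5% compound growth in then-year dollars, and deflation back to constant 1992 dollars at the same rate) can be sketched as follows; function names are ours:

```python
# Baseline DoD software expenditure profile: $24B in 1992, grown at 5% per
# year (then-year dollars), split 70% maintenance / 30% development, and
# deflated back to constant 1992 dollars at the same 5% rate.
BASE_1992 = 24.0   # $B, total DoD software spending in 1992
GROWTH = 0.05      # net annual growth / inflation rate

def then_year(year):
    """Then-year total DoD software expenditure, $B."""
    return BASE_1992 * (1 + GROWTH) ** (year - 1992)

def constant_1992(year):
    """Same expenditure deflated to constant 1992 dollars, $B."""
    return then_year(year) / (1 + GROWTH) ** (year - 1992)

for year in range(1992, 2010, 2):
    total = then_year(year)
    print(year, round(total, 1), round(0.7 * total, 1), round(0.3 * total, 1))
```

Rounding to one decimal reproduces the Table 14 rows (e.g. $26.5B total and $18.5B maintenance in 1994); the deflated baseline is flat at $24.0B in every year, which is why constant-dollar results isolate the real savings.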

