Software Cost Estimation Metrics Manual

Analysis based on data from the DoD Software Resource Data Report
This manual describes a method that takes software cost metrics data and creates cost estimating relationship models. Definitions of the data used in the methodology are discussed. The cost data definitions of other popular Software Cost Estimation Models are also discussed. The data collected from DoD's Software Resource Data Report are explained. The steps for preparing the data for analysis are described. The results of the data analysis are presented for different Operating Environments and Productivity Types. The manual wraps up with a look at modern estimating challenges.
UNCLASSIFIED
Distribution Statement A: Approved for public release
Contents
1 Introduction ..... 1
2 Metrics Definitions ..... 2
2.1 Size Measures ..... 2
2.2 Source Lines of Code (SLOC) ..... 2
2.2.1 SLOC Type Definitions ..... 2
2.2.2 SLOC Counting Rules ..... 3
2.2.2.1 Logical Lines ..... 3
2.2.2.2 Physical Lines ..... 4
2.2.2.3 Total Lines ..... 4
2.2.2.4 Non-Commented Source Statements (NCSS) ..... 4
2.3 Equivalent Size ..... 5
2.3.1 Definition and Purpose in Estimating ..... 5
2.3.2 Adapted SLOC Adjustment Factors ..... 6
2.3.3 Total Equivalent Size ..... 7
2.3.4 Volatility ..... 7
2.4 Development Effort ..... 7
2.4.1 Activities and Lifecycle Phases ..... 7
2.4.2 Labor Categories ..... 8
2.4.3 Labor Hours ..... 9
2.5 Schedule ..... 9
3 Cost Estimation Models ..... 10
3.1 Effort Formula ..... 10
3.2 Cost Models ..... 10
3.2.1 COCOMO II ..... 11
3.2.2 SEER-SEM ..... 12
3.2.3 SLIM ..... 12
3.2.4 True S ..... 13
3.3 Model Comparisons ..... 13
3.3.1 Size Inputs ..... 13
3.3.1.1 COCOMO II ..... 14
3.3.1.2 SEER-SEM ..... 14
3.3.1.3 True S ..... 16
3.3.1.4 SLIM ..... 19
3.3.2 Lifecycles, Activities and Cost Categories ..... 19
4 Software Resource Data Report (SRDR) ..... 23
4.1 DCARC Repository ..... 23
4.2 SRDR Reporting Frequency ..... 24
4.3 SRDR Content ..... 25
4.3.1 Administrative Information (SRDR Section 3.1) ..... 25
4.3.2 Product and Development Description (SRDR Section 3.2) ..... 26
4.3.3 Product Size Reporting (SRDR Section 3.3) ..... 27
4.3.4 Resource and Schedule Reporting (SRDR Section 3.4) ..... 28
4.3.5 Product Quality Reporting (SRDR Section 3.5 Optional) ..... 28
4.3.6 Data Dictionary ..... 29
5 Data Assessment and Processing ..... 30
5.1 Workflow ..... 30
5.1.1 Gather Collected Data ..... 30
5.1.2 Inspect each Data Point ..... 31
5.1.3 Determine Data Quality Levels ..... 33
5.1.4 Correct Missing or Questionable Data ..... 34
5.1.5 Normalize Size and Effort Data ..... 34
5.1.5.1 Converting to Logical SLOC ..... 34
5.1.5.2 Convert Raw SLOC into Equivalent SLOC ..... 37
5.1.5.3 Adjust for Missing Effort Data ..... 39
5.2 Data Segmentation ..... 39
5.2.1 Operating Environments (OpEnv) ..... 40
5.2.2 Productivity Types (PT) ..... 41
5.2.2.1 Finding the Productivity Type ..... 44
6 Cost Estimating Relationship Analysis ..... 46
6.1 Application Domain Decomposition ..... 46
6.2 SRDR Metric Definitions ..... 46
6.2.1 Software Size ..... 46
6.2.2 Software Development Activities and Durations ..... 46
6.3 Cost Estimating Relationships (CER) ..... 48
6.3.1 Model Selection ..... 48
6.3.2 Model Based CERs Coverage ..... 49
6.3.3 Software CERs by OpEnv ..... 50
6.3.3.1 Ground Site (GS) Operating Environment ..... 50
6.3.3.2 Ground Vehicle (GV) Operating Environment ..... 51
6.3.3.3 Aerial Vehicle (AV) Operating Environment ..... 52
6.3.3.4 Space Vehicle Unmanned (SVU) Operating Environment ..... 53
6.3.4 Software CERs by PT Across All Environments ..... 53
6.4 Productivity Benchmarks ..... 56
6.4.1 Model Selection and Coverage ..... 56
6.4.2 Data Transformation ..... 57
6.4.3 Productivity Benchmark Statistics ..... 58
6.4.4 Software Productivity Benchmark Results by Operating Environment ..... 58
6.4.5 Software Productivity Benchmark Results by Productivity Type ..... 59
6.4.6 Software Productivity Benchmarks by OpEnv and PT ..... 61
6.5 Future Work ..... 61
7 Modern Estimation Challenges ..... 62
7.1 Changing Objectives, Constraints and Priorities ..... 62
7.1.1 Rapid Change, Emergent Requirements, and Evolutionary Development ..... 62
7.1.2 Net-centric Systems of Systems (NCSoS) ..... 65
7.1.3 Model-Driven and Non-Developmental Item (NDI) Intensive Development ..... 65
7.1.4 Ultrahigh Software Systems Assurance ..... 66
7.1.5 Legacy Maintenance and Brownfield Development ..... 67
7.1.6 Agile and Kanban Development ..... 68
7.1.7 Putting It All Together at the Large Project or Enterprise Level ..... 68
7.2 Estimation Approaches for Different Processes ..... 69
8 Conclusions and Next Steps ..... 73
9 Appendices ..... 74
9.1 Acronyms ..... 74
9.2 Automated Code Counting ..... 79
9.3 Additional Adapted SLOC Adjustment Factors ..... 79
9.3.1 Examples ..... 81
9.3.1.1 Example: New Software ..... 81
9.3.1.2 Example: Modified Software ..... 81
9.3.1.3 Example: Upgrade to Legacy System ..... 81
9.4 SRDR Data Report ..... 82
9.4.1 Proposed Modifications ..... 86
9.5 MIL-STD-881C WBS Mapping to Productivity Types ..... 88
9.5.1 Aerial Vehicle Manned (AVM) ..... 88
9.5.2 Ordnance Vehicle Unmanned (OVU) ..... 90
9.5.3 Ordnance Vehicle Unmanned (OVU) ..... 92
9.5.4 Maritime Vessel Manned (MVM) ..... 93
9.5.5 Space Vehicle Manned/Unmanned (SVM/U) and Ground Site Fixed (GSF) ..... 94
9.5.6 Ground Vehicle Manned and Unmanned (GVM/U) ..... 95
9.5.7 Aerial Vehicle Unmanned (AVU) & Ground Site Fixed (GSF) ..... 97
9.5.8 Maritime Vessel Unmanned (MVU) and Maritime Vessel Manned (MVM) ..... 98
9.5.9 Ordnance Vehicle Unmanned (OVU) ..... 99
9.5.10 Ground Site Fixed (GSF) ..... 100
9.5.11 Applies to ALL Environments ..... 100
9.6 Productivity (Pr) Benchmark Details ..... 101
9.6.1 Normality Tests on Productivity Data ..... 101
9.6.1.1 Operating Environments (all Productivity Types) ..... 101
9.6.1.2 Productivity Types (all Operating Environments) ..... 102
9.6.1.3 Operating Environment - Productivity Type Sets ..... 102
9.6.2 Statistical Summaries on Productivity Data ..... 102
9.6.2.1 Operating Environments ..... 103
9.6.2.2 Productivity Types ..... 108
9.6.2.3 Operating Environment - Productivity Type Sets ..... 114
9.7 References ..... 129
Acknowledgements
The research and production of this manual was supported by the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 and the US Army Contracting Command, Joint Munitions & Lethality Center, Joint Armaments Center, Picatinny Arsenal, NJ, under RFQ 663074.

Many people worked to make this manual possible. The contributing authors were:

Cheryl Jones, US Army Armament Research Development and Engineering Center (ARDEC)
John McGarry, ARDEC
Joseph Dean, Air Force Cost Analysis Agency (AFCAA)
Wilson Rosa, AFCAA
Ray Madachy, Naval Postgraduate School
Barry Boehm, University of Southern California (USC)
Brad Clark, USC
Thomas Tan, USC
1 Introduction
Estimating the cost to develop a software application is different from almost any other manufacturing process. In other manufacturing disciplines, the product is developed once and replicated many times using physical processes. Replication improves physical process productivity (duplicate machines produce more items faster), reduces learning curve effects on people, and spreads unit cost over many items.

A software application, in contrast, is a single production item, i.e., every application is unique. The only physical processes are the documentation of ideas, their translation into computer instructions, and their validation and verification. Production productivity decreases, not increases, when more people are employed to develop the software application. Savings through replication are only realized in the development processes and in the learning curve effects on the management and technical staff. Unit cost is not reduced by creating the software application over and over again.

    "There is no good way to perform a software cost-benefit analysis, breakeven analysis, or make-or-buy analysis without some reasonably accurate method of estimating software costs and their sensitivity to various product, project, and environmental factors." - Barry Boehm

This manual helps analysts and decision makers develop accurate, easy and quick software cost estimates for different operating environments such as ground, shipboard, air and space. It was developed by the Air Force Cost Analysis Agency (AFCAA) in conjunction with DoD Service Cost Agencies, and assisted by the University of Southern California and the Naval Postgraduate School. The intent is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.

The manual consists of chapters on metric definitions (e.g., what is meant by equivalent lines of code), examples of metric definitions from commercially available cost models, the data collection and repository form, guidelines for preparing the data for analysis, analysis results, cost estimating relationships found in the data, productivity benchmarks, future cost estimation challenges, and a very large appendix.
2 Metrics Definitions
2.1 Size Measures
This chapter defines software product size measures used in Cost Estimating Relationship (CER) analysis. The definitions in this chapter should be compared to the commercial cost model definitions in the next chapter. This will help explain why estimates may vary between the analysis results in this manual and other model results.

For estimation and productivity analysis, it is necessary to have consistent measurement definitions. Consistent definitions must be used across models to permit meaningful distinctions and useful insights for project management.
Table 1: Software Size Types

Size Type: Generated
Description: Software created with automated source code generators. The code to include for equivalent size consists of automated tool generated statements.

Size Type: Converted
Description: Software that is converted between languages using automated translators.

Size Type: Commercial Off-The-Shelf Software (COTS)
Description: Pre-built commercially available software components. The source code is not available to application developers. It is not included for equivalent size. Other unmodified software not included in equivalent size are Government Furnished Software (GFS), libraries, operating systems and utilities.
The size types are applied at the source code file level for the appropriate system of interest. If a component, or module, has just a few lines of code changed then the entire component is classified as Modified even though most of the lines remain unchanged. The total product size for the component will include all lines.

Open source software is handled, as with other categories of software, depending on the context of its usage. If it is not touched at all by the development team it can be treated as a form of COTS or reused code. However, when open source is modified it must be quantified with the adaptation parameters for modified code and be added to the equivalent size. The costs of integrating open source with other software components should be added into overall project costs.
Table 2: Equivalent SLOC Rules for Development

Statement Type
  Includes: executable statements; nonexecutable declarations; compiler directives
  Excludes: comments and blank lines

How Produced
  Includes: programmed (new, reused, modified); generated (generator statements for development; 3GL generated statements for maintenance); converted

Origin
  Includes: new; adapted from a previous version, build, or release
  Excludes: unmodified COTS, GFS, library, operating system or utility
Unfortunately, not all SLOC counts are reported using a logical count type. There are other SLOC count types. These are discussed next.
Table 3: Equivalent SLOC Rules for Development

Source (Includes / Excludes)
  New
  Reused
  Modified
  Generated
    Generator statements
    3GL generated statements
  Converted
  COTS
  Volatility
2.3.3 Total Equivalent Size
Using the AAF to adjust Adapted Code size, the total equivalent size is:
Eq 2 Total Equivalent Size = New Size + (AAF x Adapted Size)
AAF assumes a linear effort relationship, but there can also be nonlinear effects. Data indicates that the AAF factor tends to underestimate modification effort [Selby 1988], [Boehm et al. 2001], [Stutzke 2005]. Two other factors used to account for these effects are Software Understanding and Programmer Unfamiliarity. These two factors and their usage are discussed in Appendix 9.3.
2.3.4 Volatility
Volatility is requirements evolution and change, but not code thrown out. To account for the added effort, volatility is expressed as an additional percentage of size to obtain the total equivalent size for estimation.
Eq 3 Total Equivalent Size = [New Size + (AAF x Adapted Size)] x (1 + Volatility)
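As a concrete illustration, Eq 2 and Eq 3 can be combined into a small calculator. The 0.4/0.3/0.3 weighting of design modified (DM), code modified (CM), and integration required (IM) used to form AAF below is the standard COCOMO II convention, taken here as an assumption since the AAF definition itself appears on an earlier page.

```python
def total_equivalent_size(new_size, adapted_size, dm, cm, im, volatility=0.0):
    """Total equivalent size per Eq 2 and Eq 3, in the same units as the inputs.

    dm, cm, im -- fractions (0.0 to 1.0) of design modified, code modified,
    and integration effort required for the adapted code.
    volatility -- fractional size growth for requirements volatility.
    The 0.4/0.3/0.3 AAF weights are the COCOMO II convention (assumption).
    """
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # Adaptation Adjustment Factor
    return (new_size + aaf * adapted_size) * (1.0 + volatility)

# 10 KSLOC new code, 20 KSLOC reused code needing full integration effort,
# with 10% requirements volatility
size = total_equivalent_size(10.0, 20.0, dm=0.0, cm=0.0, im=1.0, volatility=0.10)
```

With these inputs AAF is 0.3, so the equivalent size is (10 + 0.3 x 20) x 1.1 = 17.6 KSLOC.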
Software requirements analysis includes any prototyping activities. The excluded activities are normally supported by software personnel but are considered outside the scope of their responsibility for effort measurement. Systems Requirements Development includes equations engineering (for derived requirements) and allocation to hardware and software.

All these activities include the effort involved in documenting, reviewing and managing the work in process. These include any prototyping and the conduct of demonstrations during the development.

Transition to operations and operations and support activities are not addressed by these analyses for the following reasons:

- They are normally accomplished by different organizations or teams.
- They are separately funded using different categories of money within the DoD.
- The cost data collected by projects therefore does not include them within their scope.

From a lifecycle point of view, the activities comprising the software lifecycle are represented for new, adapted, reused, generated and COTS (Commercial Off-The-Shelf) developments. Reconciling the effort associated with the activities in the Work Breakdown Structure (WBS) across lifecycles is necessary for valid comparisons to be made between results from cost models.
Adding to the complexity of measuring what is included in effort data is that staff could be full-time or part-time and charge their hours as direct or indirect labor. The issue of capturing overtime is also a confounding factor in data capture.
2.5 Schedule
Schedule data are the start and end dates for different development phases, such as those discussed in 2.4.1. Another important aspect of schedule data is the entry (start) and exit (completion) criteria for each phase. These criteria can vary between projects depending on their definitions. As examples of exit or completion criteria, the dates reported may be when:

- Internal reviews are complete
- Formal review with the customer is complete
- Sign-off by the customer is obtained
- All high-priority action items are closed
- All action items are closed
- Products of the activity/phase are placed under configuration management
- Inspection of the products is signed off by QA
- Management sign-off is obtained

An in-depth discussion is provided in [Goethert et al. 1992].
least two models to estimate costs whenever it is possible to provide added assurance that you are within an acceptable range of variation.

Other industry cost models such as SLIM, Checkpoint and Estimacs have not been as frequently used for defense applications as they are more oriented towards business applications per [Madachy & Boehm 2008]. A previous comparative survey of software cost models can also be found in [Boehm et al. 2000b]. COCOMO II is a public domain model that USC continually updates and is implemented in several commercial tools. True S and SEER-SEM are both proprietary commercial tools with unique features but also share some aspects with COCOMO. All three have been extensively used and tailored for flight project domains. SLIM is another parametric tool that uses a different approach to effort and schedule estimation.
3.2.1 COCOMO II
The COCOMO (COnstructive COst MOdel) cost and schedule estimation model was originally published in 1981 [Boehm 1981]. COCOMO II research started in 1994, and the model continues to be updated at USC with the rest of the COCOMO model family. COCOMO II, defined in [Boehm et al. 2000], has three submodels: Applications Composition, Early Design and Post-Architecture. They can be combined in various ways to deal with different software environments. The Application Composition model is used to estimate effort and schedule on projects typically done as rapid application development. The Early Design model involves the exploration of alternative system architectures and concepts of operation. This model is based on function points (or lines of code when available) and a set of five scale factors and seven effort multipliers.

The Post-Architecture model is used when top-level design is complete, detailed information about the project is available, and the software architecture is well defined. It uses Source Lines of Code and/or Function Points for the sizing parameter, adjusted for reuse and breakage; a set of 17 effort multipliers; and a set of five scale factors that determine the economies/diseconomies of scale of the software under development. This model is the most frequent mode of estimation and is used throughout this manual. The effort formula is:
Eq 5 PM = A x Size^B x Π(EMi)

Where

PM is effort in person-months
A is a constant derived from historical project data
Size is in KSLOC (thousand source lines of code), or converted from other size measures
B is an exponent for the diseconomy of scale, dependent on additive scale drivers
EMi is an effort multiplier for the ith cost driver. The product of N multipliers is an overall effort adjustment factor to the nominal effort.
The COCOMO II effort is decomposed by lifecycle phase and activity as detailed in 3.3.2. More information on COCOMO can be found at http://csse.usc.edu/csse/research/COCOMOII/cocomo_main.html. A web-based tool for the model is at http://csse.usc.edu/tools/COCOMO.
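A minimal sketch of Eq 5 in code, using the published COCOMO II.2000 calibration values A = 2.94 and B = 0.91 + 0.01 x (sum of the five scale factors). The scale-factor and effort-multiplier ratings themselves come from the model's rating tables and are assumed inputs here.

```python
import math

def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers):
    """Effort in person-months per Eq 5: PM = A * Size^B * product(EMi).

    scale_factors -- the five scale-driver values (summed to form B);
    effort_multipliers -- the 17 Post-Architecture cost-driver values.
    A = 2.94 and B = 0.91 + 0.01 * sum(SF) are the COCOMO II.2000
    calibration constants.
    """
    A = 2.94
    B = 0.91 + 0.01 * sum(scale_factors)
    em_product = math.prod(effort_multipliers)  # overall effort adjustment factor
    return A * ksloc ** B * em_product

# Illustrative nominal project: all 17 multipliers at 1.0, scale factors
# summing to 9.0 (so B = 1.0, i.e., no economy or diseconomy of scale)
pm = cocomo_ii_effort(10.0, [1.8] * 5, [1.0] * 17)
```

With B = 1.0 and all multipliers nominal, a 10 KSLOC project estimates to 2.94 x 10 = 29.4 person-months; larger scale-factor sums raise B and amplify the diseconomy of scale.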
3.2.2 SEER-SEM
SEER-SEM is a product offered by Galorath, Inc. This model is based on the original Jensen model [Jensen 1983], and has been on the market over 15 years. The Jensen model derives from COCOMO and other models in its mathematical formulation. However, its parametric modeling equations are proprietary. Like True S, SEER-SEM estimates can be used as part of a composite modeling system for hardware/software systems. Descriptive material about the model can be found in [Galorath & Evans 2006].

The scope of the model covers all phases of the project lifecycle, from early specification through design, development, delivery and maintenance. It handles a variety of environmental and application configurations, and models different development methods and languages. Development modes covered include object oriented, reuse, COTS, spiral, waterfall, prototype and incremental development. Languages covered are 3rd and 4th generation languages (C++, FORTRAN, COBOL, Ada, etc.), as well as application generators.

The SEER-SEM cost model allows probability levels of estimates and constraints on staffing, effort or schedule, and it builds estimates upon a knowledge base of existing projects. Estimate outputs include effort, cost, schedule, staffing, and defects. Sensitivity analysis is also provided, as is a risk analysis capability. Many sizing methods are available including lines of code and function points. For more information, see the Galorath Inc. website at http://www.galorath.com.
3.2.3 SLIM
The SLIM model is based on work done by Putnam [Putnam 1978] using the Norden/Rayleigh manpower distribution. The central part of Putnam's model, called the software equation, is [Putnam & Myers 1992]:

Eq 6 Product = Productivity Parameter x (Effort/B)^(1/3) x Time^(4/3)

Where

Product is the new and modified software lines of code at delivery time
Productivity Parameter is a process productivity factor
Effort is man-years of work by all job classifications
B is a special skills factor that is a function of size
Time is elapsed calendar time in years

The Productivity Parameter, obtained from calibration, has values that fall in 36 quantized steps ranging from 754 to 3,524,578. The special skills factor, B, is a function of size in the range from 18,000 to 100,000 delivered SLOC that increases as the need for integration, testing, quality assurance, documentation and management skills grows.
The software equation can be rearranged to estimate total effort in man-years:

Eq 7 Effort = (Size x B^(1/3) / Productivity Parameter)^3 x (1/Time^4)

Putnam's model is the basis of the SLIM software tool for cost estimation and manpower scheduling [QSM 2003].
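The rearrangement in Eq 7 can be checked numerically: solving the software equation for effort and substituting back should reproduce the product size. The parameter values below are illustrative only; as noted above, the Productivity Parameter takes calibrated values between 754 and 3,524,578 and B depends on size.

```python
def slim_effort(size, b, productivity_parameter, time_years):
    """Total effort in man-years per Eq 7."""
    return (size * b ** (1.0 / 3.0) / productivity_parameter) ** 3 / time_years ** 4

def slim_product(effort, b, productivity_parameter, time_years):
    """Delivered size (SLOC) per the software equation, Eq 6."""
    return productivity_parameter * (effort / b) ** (1.0 / 3.0) * time_years ** (4.0 / 3.0)

# Illustrative (not calibrated) values: 50,000 SLOC, B = 1.0,
# Productivity Parameter = 10,000, 2-year schedule
effort = slim_effort(50_000, 1.0, 10_000, 2.0)
```

Here effort = (5.0)^3 / 2^4 = 7.8125 man-years, and feeding that effort back through Eq 6 returns the original 50,000 SLOC, confirming the two equations are algebraically consistent. The fourth-power Time term also shows the model's strong schedule-compression penalty: halving the schedule multiplies effort by 16.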
3.2.4 True S
True S is the updated product to the PRICE S model offered by PRICE Systems. PRICE S was originally developed at RCA for use internally on software projects such as the Apollo moon program, and was then released in 1977 as a proprietary model. It fits into a composite modeling system and can be used to estimate more than just software costs. Many of the model's central algorithms were published in [Park 1988]. For more details on the model and the modeling system see the PRICE Systems website at http://www.pricesystems.com.

The PRICE S model consists of three submodels that enable estimating costs and schedules for the development and support of computer systems. The model covers business systems, communications, command and control, avionics, and space systems. PRICE S includes features for reengineering, code generation, spiral development, rapid development, rapid prototyping, object oriented development, and software productivity measurement. Size inputs include SLOC, function points and/or Predictive Object Points (POPs). The True S system also provides a COCOMO II capability.

The TruePlanning estimation suite from PRICE Systems contains both the True S model and the COCOMO II cost model.
Table 5: Comparison of Model Size Inputs

Reused Software
  COCOMO II: Reused Size; % Integration Required (IM); Assessment and Assimilation (AA)
  SEER-SEM: Pre-exists Size (1)(2); Redesign Required %; Reimplementation Required %; Retest Required %
  True S: Reused Size (2); Reused Size Non-executable; % of Design Adapted; % of Code Adapted; % of Test Adapted

Generated Code
  True S: Auto Generated Code Size; Auto Generated Size Non-executable

Automatically Translated
  COCOMO II: Adapted SLOC; Automatic Translation Productivity
  True S: Auto Translated Code Size; Auto Translated Size Non-executable; % of Code Reengineered

Deleted Code
  SEER-SEM: Deleted Size
  True S: Deleted Size; Code Removal Complexity

Volatility
  COCOMO II: Requirements Evolution and Volatility (REVL)
  SEER-SEM: Requirements Volatility (Change) (3)

Notes:
1 - Specified separately for Designed for Reuse and Not Designed for Reuse
2 - Reused is not consistent with AFCAA definition if DM or CM > 0
3 - Not a size input but a multiplicative cost driver
The primary unit of software size in the effort models is Thousands of Source Lines of Code (KSLOC). KSLOC can be converted from other size measures, and additional size units can be used directly in the models as described next. User-defined proxy sizes can be developed for any of the models.
3.3.1.1 COCOMO II
The COCOMO II size model is based on SLOC, or function points converted to SLOC, and can be calibrated and used with other software size units. Examples include use cases, use case points, object points, physical lines, and others. Alternative size measures can be converted to lines of code and used directly in the model, or the model can be independently calibrated to different measures.
3.3.1.2 SEER-SEM
Several sizing units can be used alone or in combination. SEER can use SLOC, function points, and custom proxies. COTS elements are sized with Features and QuickSize. SEER allows proxies
as a flexible way to estimate software size. Any countable artifact can be established as a measure. Custom proxies can be used with other size measures in a project. Available predefined proxies that come with SEER include Web Site Development, Mark II Function Points, Function Points (for direct IFPUG standard function points), and Object-Oriented Sizing.
SEER converts all size data into internal size units, also called effort units. Sizing in SEER-SEM can be based on function points, source lines of code, or user-defined metrics. Users can combine metrics or select a single metric for any project element or for the entire project. COTS WBS elements also have specific size inputs defined either by Features, Object Sizing, or QuickSize, which describe the functionality being integrated.
New Lines of Code are the original lines created for the first time from scratch.
Pre-existing software is software that is modified to fit into a new system. There are two categories of pre-existing software:
- Pre-existing, Designed for Reuse
- Pre-existing, Not Designed for Reuse.
Both categories of pre-existing code then have the following subcategories:
- Pre-existing lines of code, which is the number of lines from a previous system
- Lines to be Deleted, which are those lines deleted from a previous system.
Redesign Required is the percentage of existing code that must be redesigned to meet new system requirements.
Reimplementation Required is the percentage of existing code that must be reimplemented, physically recoded, or re-entered into the system, such as code that will be translated into another language.
Retest Required is the percentage of existing code that must be retested to ensure that it is functioning properly in the new system.
SEER then uses different proportional weights with these parameters in its AAF equation:

Eq 8    Pre-existing Effective Size = (0.4 x A) + (0.25 x B) + (0.35 x C)

where
- A is the percentage of code redesign
- B is the percentage of code reimplementation
- C is the percentage of code retest required
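Eq 8 can be sketched directly in code; the input values in the example are hypothetical.

```python
# Minimal sketch of SEER-SEM's pre-existing effective size (Eq 8).
# A, B, C are the fractions of pre-existing code requiring redesign,
# reimplementation, and retest, expressed here as 0..1 values.

def seer_effective_size(pre_existing_sloc: float,
                        redesign: float,
                        reimplementation: float,
                        retest: float) -> float:
    aaf = 0.4 * redesign + 0.25 * reimplementation + 0.35 * retest
    return pre_existing_sloc * aaf

# 10,000 pre-existing SLOC: half redesigned, 20% recoded, fully retested
size = seer_effective_size(10_000, redesign=0.5,
                           reimplementation=0.2, retest=1.0)  # ~6,000 ESLOC
```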
SEER also has the capability to take alternative size inputs:
Function-Point Based Sizing
- External Input (EI)
- External Output (EO)
- Internal Logical File (ILF)
- External Interface Files (EIF)
- External Inquiry (EQ)
- Internal Functions (IF) - any functions that are neither data nor transactions
Proxies
- Web Site Development
- Mark II Function Points
- Function Points (direct)
- Object-Oriented Sizing.
COTS Elements
- QuickSize
  - Application Type Parameter
  - Functionality Required Parameter
- Features
  - Number of Features Used
  - Unique Functions
  - Data Tables Referenced
  - Data Tables Configured
3.3.1.3 True S
The True S software cost model size measures may be expressed in different size units including Source Lines of Code (SLOC), function points, Predictive Object Points (POPs), or Use Case Conversion Points (UCCPs). True S also differentiates executable from non-executable software sizes. Functional Size describes software size in terms of the functional requirements that you expect a Software COTS component to satisfy. The True S software cost model size definitions for all of the size units are listed below.
Adapted Code Size
  This describes the amount of existing code that must be changed, deleted, or adapted for use in the new software project. When the value is zero (0.00), the value for New Code Size or Reused Code Size must be greater than zero.
Adapted Size Non-executable
  This value represents the percentage of the adapted code size that is non-executable (such as data statements, type declarations, and other non-procedural statements). Typical values for fourth-generation languages range from 5.00 percent to 30.00 percent. When a value cannot be obtained by any other means, the suggested nominal value for non-executable code is 15.00 percent.
Amount for Modification
  This represents the percent of the component functionality that you plan to modify, if any. The Amount for Modification value (like Glue Code Size) affects the effort calculated for the Software Design, Code and Unit Test, Perform Software Integration and Test, and Perform Software Qualification Test activities.
Auto Gen Size Non-executable
  This value represents the percentage of the Auto Generated Code Size that is non-executable (such as data statements, type declarations, and other non-procedural statements). Typical values for fourth-generation languages range from 5.00 percent to 30.00 percent. If a value cannot be obtained by any other means, the suggested nominal value for non-executable code is 15.00 percent.
Auto Generated Code Size
  This value describes the amount of code generated by an automated design tool for inclusion in this component.
Auto Trans Size Non-executable
  This value represents the percentage of the Auto Translated Code Size that is non-executable (such as data statements, type declarations, and other non-procedural statements). Typical values for fourth-generation languages range from 5.00 percent to 30.00 percent. If a value cannot be obtained by any other means, the suggested nominal value for non-executable code is 15.00 percent.
Auto Translated Code Size
  This value describes the amount of code translated from one programming language to another by using an automated translation tool (for inclusion in this component).
Auto Translation Tool Efficiency
  This value represents the percentage of code translation that is actually accomplished by the tool. More efficient auto-translation tools require more time to configure the tool to translate. Less efficient tools require more time for code and unit test on code that is not translated.
Code Removal Complexity
  This value describes the difficulty of deleting code from the adapted code. Two things need to be considered when deleting code from an application or component: the amount of functionality being removed, and how tightly or loosely this functionality is coupled with the rest of the system. Even if a large amount of functionality is being removed, if it is accessed through a single point rather than from many points, the complexity of the integration will be reduced.
Deleted Code Size
  This describes the amount of pre-existing code that you plan to remove from the adapted code during the software project. The Deleted Code Size value represents code that is included in Adapted Code Size; therefore, it must be less than or equal to the Adapted Code Size value.
Equivalent Source Lines of Code
The ESLOC (Equivalent Source Lines of Code) value describes the magnitude of a selected cost object in Equivalent Source Lines of Code size units. True S does not use ESLOC in routine model calculations, but provides an ESLOC value for any selected cost object. Different organizations use different formulas to calculate ESLOC.
The True S calculation for ESLOC is:

Eq 9    ESLOC = New Code + (0.7 x Adapted Code) + (0.1 x Reused Code)

To calculate ESLOC for a Software COTS, True S first converts Functional Size and Glue Code Size inputs to SLOC using a default set of conversion rates. New Code includes Glue Code Size, and Functional Size when the value of Amount for Modification is greater than or equal to 25%. Adapted Code includes Functional Size when the value of Amount for Modification is less than 25% and greater than zero. Reused Code includes Functional Size when the value of Amount for Modification equals zero.
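Eq 9 and the COTS allocation rules above can be sketched in code. The function names are illustrative, and Functional Size is assumed to be already converted to SLOC (True S applies its default conversion rates first).

```python
# Sketch of the True S ESLOC calculation (Eq 9) plus the COTS Functional
# Size allocation rules described above. Function names are illustrative;
# Functional Size is assumed already converted to SLOC.

def true_s_esloc(new: float, adapted: float, reused: float) -> float:
    return new + 0.7 * adapted + 0.1 * reused

def allocate_functional_size(amount_for_modification: float) -> str:
    """Return which ESLOC bucket a COTS component's size falls into."""
    if amount_for_modification >= 0.25:
        return "new"        # modified 25% or more: treated as New Code
    if amount_for_modification > 0.0:
        return "adapted"    # modified, but under 25%
    return "reused"         # no modification

esloc = true_s_esloc(new=10_000, adapted=5_000, reused=20_000)
```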
Functional Size
  This value describes software size in terms of the functional requirements that you expect a Software COTS component to satisfy. When you select Functional Size as the unit of measure (Size Units value) to describe a Software COTS component, the Functional Size value represents a conceptual-level size that is based on the functional categories of the software (such as Mathematical, Data Processing, or Operating System). A measure of Functional Size can also be specified using Source Lines of Code, Function Points, Predictive Object Points, or Use Case Conversion Points if one of these is the Size Unit selected.
Glue Code Size
  This value represents the amount of Glue Code that will be written. Glue Code holds the system together, provides interfaces between Software COTS components, interprets return codes, and translates data into the proper format. Also, Glue Code may be required to compensate for inadequacies or errors in the COTS component selected to deliver desired functionality.
New Code Size
  This value describes the amount of entirely new code that does not reuse any design, code, or test artifacts. When the value is zero (0.00), the value must be greater than zero for Reused Code Size or Adapted Code Size.
New Size Non-executable
  This value describes the percentage of the New Code Size that is non-executable (such as data statements, type declarations, and other non-procedural statements). Typical values for fourth-generation languages range from 5.00 percent to 30.00 percent. If a value cannot be obtained by any other means, the suggested nominal value for non-executable code is 15.00 percent.
Percent of Code Adapted
  This represents the percentage of the adapted code that must change to enable the adapted code to function and meet the software project requirements.
Percent of Design Adapted
  This represents the percentage of the existing (adapted code) design that must change to enable the adapted code to function and meet the software project requirements. This value describes the planned redesign of adapted code. Redesign includes architectural design changes, detailed design changes, and any necessary reverse engineering.
Percent of Test Adapted
  This represents the percentage of the adapted code test artifacts that must change. Test plans and other artifacts must change to ensure that software that contains adapted code meets the performance specifications of the Software Component cost object.
Reused Code Size
  This value describes the amount of pre-existing, functional code that requires no design or implementation changes to function in the new software project. When the value is zero (0.00), the value must be greater than zero for New Code Size or Adapted Code Size.
Reused Size Non-executable
  This value represents the percentage of the Reused Code Size that is non-executable (such as data statements, type declarations, and other non-procedural statements). Typical values for fourth-generation languages range from 5.00 percent to 30.00 percent. If a value cannot be obtained by any other means, the suggested nominal value for non-executable code is 15.00 percent.
3.3.1.4 SLIM
SLIM uses effective system size composed of new and modified code. Deleted code is not considered in the model. If there is reused code, then the Productivity Index (PI) factor may be adjusted to add in time and effort for regression testing and integration of the reused code.
SLIM provides different sizing techniques including:
- Sizing by history
- Total system mapping
- Sizing by decomposition
- Sizing by module
- Function point sizing.
Alternative sizes to SLOC, such as use cases or requirements, can be used in Total System Mapping. The user defines the method and quantitative mapping factor.
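As an illustration, Total System Mapping reduces to scaling a countable artifact by the user's mapping factor; the artifact type and factor below are hypothetical, not SLIM defaults.

```python
# Hypothetical sketch of SLIM-style Total System Mapping: the user picks
# a countable artifact (e.g., use cases) and supplies a quantitative
# mapping factor relating that artifact to size.

def total_system_mapping(artifact_count: int, size_per_artifact: float) -> float:
    return artifact_count * size_per_artifact

# e.g., 120 use cases at an assumed 450 SLOC per use case
size_sloc = total_system_mapping(120, 450.0)
```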
In SEER-SEM the standard lifecycle activities include: (1) System Concept, (2) System Requirements Design, (3) Software Requirements Analysis, (4) Preliminary Design, (5) Detailed Design, (6) Code and Unit Test, (7) Component Integration and Testing, (8) Program Test, (9) Systems Integration through OT&E & Installation, and (10) Operation Support. Activities may be defined differently across development organizations and mapped to SEER-SEM's designations.
In SLIM the lifecycle maps to four general phases of software development. The default phases are: 1) Concept Definition, 2) Requirements and Design, 3) Construct and Test, and 4) Perfective Maintenance. The phase names, activity descriptions, and deliverables can be changed in SLIM. The main build phase initially computed by SLIM includes the detailed design through system test phases, but the model has the option to include the requirements and design phase, covering software requirements and preliminary design, and a feasibility study phase to encompass system requirements and design.
The phases covered in the models are summarized in Table 6.
Table 6 Lifecycle Phase Coverage

Model       Phases
COCOMO II   Inception
            Elaboration
            Construction
            Transition
SEER-SEM    System Concept
            System Requirements Design
            Software Requirements Analysis
            Preliminary Design
            Detailed Design
            Code / Unit Test
            Component Integration and Testing
            Program Test
            System Integration Through OT&E and Installation
            Operation Support
True S      Concept
            System Requirements
            Software Requirements
            Preliminary Design
            Detailed Design
            Code / Unit Test
            Integration and Test
            Hardware / Software Integration
            Field Test
            System Integration and Test
            Maintenance
SLIM        Concept Definition
            Requirements and Design
            Construction and Test
            Perfective Maintenance
The work activities estimated in the respective tools are in Table 7.

Table 7 Work Activities Coverage

Model       Activities
COCOMO II   Management
            Environment / CM
            Requirements
            Design
            Implementation
            Assessment
            Deployment
SEER-SEM    Management
            Software Requirements
            Design
            Code
            Data Programming
            Test
            CM
            QA
True S      Design
            Programming
            Data
            SEPGM
            QA
            CFM
SLIM        WBS Sub-elements of Phases:
            Concept Definition
            Requirements and Design
            Construct and Test
            Perfective Maintenance
The categories of labor covered in the estimation models and tools are listed in Table 8.

Table 8 Labor Activities Covered

Model       Categories
COCOMO II   Software Engineering Labor*
SEER-SEM    Software Engineering Labor*
            Purchases
True S      Software Engineering Labor*
            Purchased Good
            Purchased Service
            Other Cost
SLIM        Software Engineering Labor

* Project Management (including contracts), Analysts, Designers, Programmers, Testers, CM, QA, and Documentation
1 http://dcarc.cape.osd.mil/CSDR/CSDROverview.aspx
Table 9 SRDR Reporting Events

Event                    Report Due   Who Provides   Scope of Report
At start of each build   Initial      Contractor     Estimates for completion for the build only.

It may not be readily apparent how important it is to understand the submission criteria. SRDR records are a mixture of complete contracts and individual builds within a contract. And there are initial and final reports, along with corrections. Mixing contract data and build data, mixing initial and final results, or not using the latest corrected version will produce inconclusive, if not incorrect, results.
The report consists of two pages (see Chapter 9.4). The fields on each page are listed below.
4.3.3 Product Size Reporting (SRDR Section 3.3)
Number of Software Requirements. Provide the actual number of software requirements.
- Total Requirements. Enter the actual number of total requirements satisfied by the developed software product at the completion of the increment or project.
- New Requirements. Of the total actual number of requirements reported, identify how many are new requirements.
Number of External Interface Requirements. Provide the number of external interface requirements, as specified below, not under project control that the developed system satisfies.
- Total External Interface Requirements. Enter the actual number of total external interface requirements satisfied by the developed software product at the completion of the increment or project.
- New External Interface Requirements. Of the total number of external interface requirements reported, identify how many are new external interface requirements.
Requirements Volatility. Indicate the amount of requirements volatility encountered during development as a percentage of requirements that changed since the Software Requirements Review.
Software Size.
- Delivered Size. Capture the delivered size of the product developed, not including any code that was needed to assist development but was not delivered (such as temporary stubs, test scaffolding, or debug statements). Additionally, the code shall be partitioned (exhaustive with no overlaps) into appropriate development categories. A common set of software development categories is new, reused with modification, reused without modification, carryover code, deleted code, and auto-generated code.
  - Reused Code With Modification. When code is included that was reused with modification, provide an assessment of the amount of redesign, recode, and retest required to implement the modified or reused code.
  - Reused Code Without Modification. Code reused without modification is code that has no design or code modifications. However, there may be an amount of retest required. The percentage of retest should be reported with the retest factors described above.
  - Carryover Code. The report shall distinguish between code developed in previous increments that is carried forward into the current increment and code added as part of the effort on the current increment.
  - Deleted Code. Include the amount of delivered code that was created and subsequently deleted from the final delivered code.
  - Auto-generated Code. If the developed software contains auto-generated source code, report an auto-generated code sizing partition as part of the set of development categories.
- Subcontractor Developed Code.
Counting Convention. Identify the counting convention used to count software size.
Size Reporting by Programming Language (Optional).
Standardized Code Counting (Optional). If requested, the contractor shall use a publicly available and documented code counting tool, such as the University of Southern California Code Count tool, to obtain a set of standardized code counts that reflect logical size. These results shall be used to report software sizing.
The Final Developer Report shall contain actual schedules and actual total effort for each software development activity.
Effort. The units of measure for software development effort shall be reported in staff hours. Effort shall be partitioned into discrete software development activities.
WBS Mapping.
Subcontractor Development Effort. The effort data in the SRDR report shall be separated into a minimum of two discrete categories and reported separately: Prime Contractor Only and All Other Subcontractors.
Schedule. For each software development activity reported, provide the actual start and end dates for that activity.
4.3.6 Data Dictionary
The SRDR Data Dictionary contains, at a minimum, the following information in addition to the specific requirements identified in Sections 3.1 through 3.5:
- Experience Levels. Provide the contractor's specific definition (i.e., the number of years of experience) for personnel experience levels reported in the SRDR report.
- Software Size Definitions. Provide the contractor's specific internal rules used to count software code size.
- Software Size Categories. For each software size category identified (i.e., New, Modified, Unmodified, etc.), provide the contractor's specific rules and/or tools used for classifying code into each category.
- Peak Staffing. Provide a definition that describes what activities were included in peak staffing.
- Requirements Count (Internal). Provide the contractor's specific rules and/or tools used to count requirements.
- Requirements Count (External). Provide the contractor's specific rules and/or tools used to count external interface requirements.
- Requirements Volatility. Provide the contractor's internal definitions used for classifying requirements volatility.
- Software Development Activities. Provide the contractor's internal definitions of labor categories and activities included in the SRDR report's software activity.
- Product Quality Reporting. Provide the contractor's internal definitions for product quality metrics being reported and specific rules and/or tools used to count the metrics.
5.1 Workflow
The data assessment and processing workflow has six steps. This workflow was used in the analysis of the SRDR data. Each of these steps is described in detail.
1. Gather the data that has been collected.
2. Review and inspect each data point.
3. Determine a quantitative quality level based on the data inspection.
4. Correct missing or questionable data. There are several things that can be done about this. Data that cannot be repaired is excluded from the analysis.
5. Normalize the data to a common unit of measure and a common scope of what is covered by the data.
6. Finally, segment the data by Operating Environment and Software Domain.
The data has to be transformed from different formats into a common data format that supports the analysis objectives. A common data format for cost estimation analysis would differ from one for analysis of requirements growth, defect discovery/removal, or process improvement return on investment, to name a few.
The common data format for cost estimation analysis requires detailed information on:
- Amount of workload (expressed as a functional measure or a product measure)
- Development and support effort
- Project or build duration
Additional contextual data is needed to provide information on what the data represents, e.g.:
- Organization that developed the software
- What the application does
- Where the software fits into the system (is it all of the software, a build, a configuration item, or a small software unit)
The common data format used in analyzing SRDR data included additional information beyond what was found in the SRDR report.
Size Data
- Does the size data look sound?
- Is the size part of a multi-build release?
- Was all code auto-generated?
- Was code rewritten after auto-generation (AG)?
- Was a portion of a legacy system included in the sizing data?
- How much software was adapted (modified)?
- How much software was reused (no changes)?
- Is there effort and schedule data for each software activity?
- Is there repeating size data?
Effort Data
- What labor was included in the reported hours?
  - Engineering labor
  - Management labor
  - Support labor: CM, QA, Process Improvement, Safety, Security, Development Environment support
- What labor was reported in the Other activity?
- Was Requirements effort reported for all builds?
- Were there continuous integration activities across all builds?
Schedule Data
- Was there schedule compression mentioned on the project?
- Were there parallel multiple builds (same start and end date)?
Productivity Screening
- Is a quick productivity check reasonably close to software with similar functionality?
- Is this record an outlier in a scatterplot with other similar data?
5.1.3 Determine Data Quality Levels
From the inspection process, assign the record a data quality rating. The criteria in Table 10 can be used to determine rating values.
Table 10 Data Quality Rating Scale

Attribute                          Value and Condition
Size:                              1.0 if size data present
                                   0   if no size data
Size Count Type                    1.0 if size is Logical SLOC
(providing size data is present):  0.7 if size is Non-Commented Source Statements
                                   0.5 if size is Physical Lines (comment and source statements)
                                   0.4 if size is Total Lines (all lines in file: blank, comment, source)
                                   0   if no size data
ESLOC Parameters:                  1.0 if modification parameters provided for Auto-gen, Modified & Reused
                                   0.5 if New SLOC and no size data for Auto-gen, Modified or Reused
                                   0   if no modification parameters provided for either Modified, Auto-gen, or Reused SLOC counts
CSCI-level Data:                   1.0 if Total Size is 5,000 < Size < 250,000
                                   0   if Total Size < 5,000 or Size > 250,000
Effort:                            1.0 if effort reported for all phases
                                   0.5 if effort is reported as a total
                                   0   if effort is missing for a phase
Schedule:                          1.0 if duration reported for all phases
                                   0.5 if duration is reported as a total
                                   0   if duration is missing for a phase
Productivity:                      1.0 if record is in the expected value range
                                   0.5 if record is within 1 standard deviation from the mean
                                   0   if record is a clear outlier
As each record is rated by the criteria above, an overall quality level is assigned by:

Eq 10   Quality Level = (Size + Size Count Type + ESLOC Parameters + CSCI-level + Effort + Schedule + Productivity) / 7

The quality level is a quick indicator of the degree of issues found in the record. As the record is corrected through supplemental information, the rating is revised. Because the range of the quality level scale is between 0 and 1.0, it can be used as a weight during analysis.
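Eq 10 is a plain average of the seven attribute ratings. A short sketch (the attribute keys are illustrative names):

```python
# Sketch of the SRDR record quality level (Eq 10): the mean of the seven
# attribute ratings from Table 10, each in the range 0..1.

QUALITY_ATTRIBUTES = ("size", "size_count_type", "esloc_parameters",
                      "csci_level", "effort", "schedule", "productivity")

def quality_level(ratings: dict) -> float:
    return sum(ratings[a] for a in QUALITY_ATTRIBUTES) / len(QUALITY_ATTRIBUTES)

# e.g., a record with NCSS counts, partial ESLOC parameters, total-only schedule
record = {"size": 1.0, "size_count_type": 0.7, "esloc_parameters": 0.5,
          "csci_level": 1.0, "effort": 1.0, "schedule": 0.5,
          "productivity": 1.0}
level = quality_level(record)  # about 0.81
```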
5.1.4 Correct Missing or Questionable Data
The quality level makes clear which records need additional work. There are several approaches available for resolving missing or questionable data. These are listed in a recommended order:
1. Consult the accompanying Data Dictionary discussed in Chapter 4.3.6.
2. Consult any supplemental information on the project that is available, e.g., ASP, CARD, CDD, EVMS, SRS, WBS, etc.
3. Schedule follow-up meetings with the SRDR data contributor. Data quality issues that were fixed in the past by the SRDR contributor:
   - Revised missing size, effort, and duration data
   - Obtained Adaptation Adjustment Factor (AAF) parameters
   - Confirmed productivity type and environment
   - Confirmed CSCI level of reporting
   - Asked about problems with high/low, long/short size, effort, and duration data
As a result of inspecting the data and attempting to correct the issues found, no bad data or outliers are excluded from the analysis on arbitrary grounds. However, data issues that cannot be resolved are excluded from analysis.
If a source line of code count was defined as either Total or NCSS, these counts were converted to a Logical SLOC count. An experiment was run using the UCC tool, described in Appendix 9.2, on public domain software applications and additional contributions from USC CSSE Affiliates. Total, NCSS, and Logical counts were taken from the program files. Six programming languages were sampled:
- Ada
- C#
- C/C++
- Java
- PERL
- PHP
The total number of data points was 40. The results of this experiment are described next.
NCSS Line Count Conversion to Logical
The size counts for NCSS and Logical were analyzed for their relationship. Two analyses were conducted, one for all of the size data and another for the lower 80% of the size data. The two relationships are expressed as follows (the intercept was constrained to zero 2):

Eq 11   All Sizes:  Logical SLOC count = 0.44 x NCSS count
Eq 12   Lower 80%:  Logical SLOC count = 0.66 x NCSS count
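Applied in code, the conversions are simple scalings (a sketch; the function name is illustrative):

```python
# Sketch: convert an NCSS count to an estimated Logical SLOC count using
# the regression factors from Eq 11 (all sizes) and Eq 12 (lower 80%).

def ncss_to_logical(ncss: int, lower_80_percent: bool = True) -> float:
    factor = 0.66 if lower_80_percent else 0.44
    return factor * ncss

logical_lower80 = ncss_to_logical(10_000)         # lower-80% factor (Eq 12)
logical_all = ncss_to_logical(10_000, False)      # all-sizes factor (Eq 11)
```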
The statistics for these relationships are in Table 11 and a scatterplot in Figure 1.

Table 11 NCSS-Logical Relationship Statistics

2 When modeling this relationship, an overhead amount (as represented by an intercept value) does not make sense; i.e., there is no overhead if there are zero lines to be converted. Incidentally, when the regression was run on all sizes without the zero constraint, the constant had a T-statistic of 1.90 and a P-level of 0.70.
Table 12 Total-Logical Relationship Statistics
Conclusion
The 80% solution was used in this analysis. The 80% conversion factors appear to be more reasonable than the 100% factors. A future version of this manual will explore the relationships of NCSS and Total counts to Logical counts for each of the six programming languages.
Auto-Generated code does not require the DM or CM adaptation parameters. However, it does require testing, IM. If Auto-Generated code does require modification, then it becomes Modified code and the adaptation factors for Modified code apply.
Reused code does not require the DM or CM adaptation parameters either. It also requires testing, IM. If Reused code does require modification, then it becomes Modified code and the adaptation factors for Modified code apply.
Modified code requires all three parameters, DM, CM, and IM, representing modifications to the modified code's design, code, and integration testing.
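These rules can be sketched as an equivalent-size computation. The 0.4/0.3/0.3 weights below are the standard COCOMO II AAF values and are an assumption here; the function names and example inputs are illustrative.

```python
# Sketch of equivalent-SLOC computation from the adaptation rules above.
# The AAF weights (0.4 DM + 0.3 CM + 0.3 IM) are the standard COCOMO II
# values, assumed here; DM = CM = 0 for auto-generated and reused code.

def aaf(dm: float, cm: float, im: float) -> float:
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def equivalent_sloc(new: float,
                    auto_gen: float, im_auto: float,
                    reused: float, im_reused: float,
                    modified: float, dm: float, cm: float, im: float) -> float:
    return (new
            + auto_gen * aaf(0.0, 0.0, im_auto)   # integration testing only
            + reused * aaf(0.0, 0.0, im_reused)   # integration testing only
            + modified * aaf(dm, cm, im))         # design, code, and test

esloc = equivalent_sloc(new=20_000,
                        auto_gen=5_000, im_auto=0.10,
                        reused=10_000, im_reused=0.30,
                        modified=8_000, dm=0.25, cm=0.30, im=0.70)
```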
Table 13 shows DM, CM, and IM for different productivity types. The table shows the code type, the number of records used to derive the adaptation parameters, the mean value of each parameter with its 95% confidence interval, and the median value. The adaptation adjustment factor (AAF) is shown in the last column. This factor is the portion of adapted code that will be counted as equivalent SLOC. Unfortunately, there was not enough data to support reporting for all productivity types.
Table 13 Adapted Code Parameters

PT    Code Type   #    DM Mean       DM Mdn   CM Mean       CM Mdn   IM Mean       IM Mdn   AAF
SCP   Auto-Gen    0    -             -        -             -        0.00 ± 0.00   0.00     0.00
SCP   Reused      18   -             -        -             -        0.51 ± 0.21   0.42     0.15
SCP   Modified    7    0.26 ± 0.22   0.25     0.33 ± 0.20   0.50     0.66 ± 0.40   1.00     0.40
RTE   Auto-Gen    0    -             -        -             -        0.00 ± 0.00   0.00     0.00
RTE   Reused      8    -             -        -             -        0.17 ± 0.23   0.10     0.05
RTE   Modified    14   0.13 ± 0.10   0.05     0.30 ± 0.19   0.10     0.95 ± 0.07   1.00     0.85
MP    Auto-Gen    1    -             -        -             -        0.13 ± 0.00   0.13     0.04
MP    Reused      12   -             -        -             -        0.36 ± 0.20   0.33     0.11
MP    Modified    21   0.75 ± 0.12   1.00     0.89 ± 0.12   1.00     0.95 ± 0.07   1.00     0.85
SYS   Auto-Gen    12   -             -        -             -        0.37 ± 0.25   0.13     0.11
SYS   Reused      6    -             -        -             -        0.56 ± 0.40   0.50     0.17
SYS   Modified    14   0.22 ± 0.19   0.03     0.34 ± 0.20   0.17     0.68 ± 0.18   0.58     0.39
SCI   Auto-Gen    7    -             -        -             -        0.12 ± 0.20   0.10     0.04
SCI   Reused      15   -             -        -             -        0.46 ± 0.20   0.33     0.14
SCI   Modified    10   0.34 ± 0.30   0.17     0.53 ± 0.30   0.41     0.78 ± 0.18   0.88     0.53
IIS   Auto-Gen    2    -             -        -             -        0.33 ± 0.00   0.33     0.10
IIS   Reused      7    -             -        -             -        0.45 ± 0.24   0.33     0.14
IIS   Modified    4    1.00 ± 0.00   1.00     0.81 ± 0.60   1.00     0.90 ± 0.32   1.00     0.91

(DM and CM do not apply to Auto-Gen and Reused code, shown as "-".)
General observations and usage guidelines are:
- The more real-time the nature of the software, the less the design is modified; i.e., Intelligence and Information Systems (IIS) has a DM of 100% whereas Sensor Control and Signal Processing (SCP) has a DM of 26%.
- The same is generally true for CM. The real-time nature appears to influence how much code is modified.
- IM is usually higher than either DM or CM. If the software being estimated requires more reliability or is more complex, a higher value for IM should be used.
While the mean value is provided for DM, CM, and IM, compare the mean to the median. This is an indication of skew in the data. This should also influence your decision on which values to choose within the 95% confidence interval.
A future version of this manual will process more data and expand the adapted code parameter table to additional productivity types. It will also analyze these parameters across operating environments.
A future version of this manual will also process more data and expand the average effort percentages table to additional productivity types. Additionally, analysis of schedule duration for the different activities will be conducted.
Instead of developing CERs and SERs with many parameters, the approach taken by this project is based on grouping similar software applications together. These groups are called Application Domains. Application Domains implement a combination of hardware and software components to achieve the intended functionality. However, because Application Domains tend to represent an entire subsystem, e.g., Communications, the approach taken was to use a generic description of software domains called productivity types (PT). The operating environment for each PT is considered as well. Both the operating environment and the domain are considered in this analysis to produce the productivity types.
Operating environments are distinguished by characteristics such as:
- High-speed vehicle versus stationary
- Battery-operated versus ground power
- Unrecoverable platform versus readily accessible
- Limited, non-upgradeable computing processor capacity versus racks of processors
- Fixed internal and external memory capacity versus expandable capacity
Table 15 Operating Environments

Environment            OpEnv            Examples
Ground Site (GS)       Fixed (GSF)      Command Post, Ground Operations Center, Ground Terminal, Test Facilities
                       Mobile (GSM)     Intelligence gathering stations mounted on vehicles, Mobile missile launcher
Ground Vehicle (GV)    Manned (GVM)     Tanks, Howitzers, Personnel carrier
                       Unmanned (GVU)   Robotic vehicles
Maritime Vessel (MV)   Manned (MVM)     Aircraft carriers, destroyers, supply ships, submarines
                       Unmanned (MVU)   Mine hunting systems, Towed sonar array
Aerial Vehicle (AV)    Manned (AVM)     Fixed-wing aircraft, Helicopters
                       Unmanned (AVU)   Remotely piloted air vehicles
Space Vehicle (SV)     Manned (SVM)     Passenger vehicle, Cargo vehicle, Space station
                       Unmanned (SVU)   Orbiting satellites (weather, communications), Exploratory space vehicles
Ordnance Vehicle (OV)  Unmanned (OVU)   Air-to-air missiles, Air-to-ground missiles, Smart bombs, Strategic missiles
The operating environments can be aggregated into six high-level environments. This is useful
when there is not enough data for each of the 11 environments in Table 15:
1. Ground Site (GS)
2. Ground Vehicle (GV)
3. Maritime Vessel (MV)
4. Aerial Vehicle (AV)
5. Space Vehicle (SV)
6. Ordnance Vehicle (OV)
Productivity types are groups of application productivities that are characterized by the
following:
- Required software reliability
- Database size, if there is a large data processing and storage component to the software application
- Product complexity
- Integration complexity
- Real-time operating requirements
- Platform volatility, target system volatility
- Special display requirements
- Development re-hosting
- Quality assurance requirements
- Security requirements
- Assurance requirements
- Required testing level
Table 16 Productivity Types

Sensor Control and Signal Processing (SCP): Software that requires timing-dependent device coding to enhance, transform, filter, convert, or compress data signals. Ex.: beam steering controller, sensor receiver/transmitter control, sensor signal processing, sensor receiver/transmitter test. Examples of sensors: antennas, lasers, radar, sonar, acoustic, electromagnetic.

Vehicle Control (VC): Hardware & software necessary for the control of vehicle primary and secondary mechanical devices and surfaces. Ex.: Digital Flight Control, Operational Flight Programs, Fly-By-Wire Flight Control System, Flight Software, Executive.

Vehicle Payload (VP): Hardware & software which controls and monitors vehicle payloads and provides communications to other vehicle subsystems and payloads. Ex.: Weapons delivery and control, Fire Control, Airborne Electronic Attack subsystem controller, Stores and Self-Defense program, Mine Warfare Mission Package.

Real Time Embedded (RTE): Real-time data processing unit responsible for directing and processing sensor input/output. Ex.: devices such as Radio, Navigation, Guidance, Identification, Communication, Controls and Displays, Data Links, Safety, Target Data Extractor, Digital Measurement Receiver, Sensor Analysis, Flight Termination, Surveillance, Electronic Countermeasures, Terrain Awareness and Warning, Telemetry, Remote Control.

Mission Processing (MP): Vehicle onboard master data processing unit(s) responsible for coordinating and directing the major mission systems. Ex.: Mission Computer Processing, Avionics, Data Formatting, Air Vehicle Software, Launcher Software, Tactical Data Systems, Data Control and Distribution, Mission Processing, Emergency Systems, Launch and Recovery System, Environmental Control System, Anchoring, Mooring and Towing.

Process Control (PC): Software that manages the planning, scheduling, and execution of a system based on inputs, generally sensor-driven.

System Software (SYS): Layers of software that sit between the computing platform and applications. Ex.: Health Management, Link 16, Information Assurance, Framework, Operating System Augmentation, Middleware, Operating Systems.

Planning Software (PLN): Provides the capability to maximize the use of the platform. The system supports all the mission requirements of the platform and may have the capability to program onboard platform systems with routing, targeting, performance, map, and intel data.

Scientific Software (SCI): Non-real-time software that involves significant computations and scientific analysis. Ex.: Environment Simulations, Offline Data Analysis, Vehicle Control Simulators.

Training Software (TRN): Hardware and software that are used for educational and training purposes. Ex.: Onboard or Deliverable Training Equipment & Software, Computer-Based Training.
Telecommunications (TEL): The transmission of information, e.g., voice, data, commands, images, and video, across different mediums and distances. Primarily software systems that control or manage transmitters, receivers, and communications channels. Ex.: switches, routers, integrated circuits, multiplexing, encryption, broadcasting, protocols, transfer modes, etc.

Software Tools (TOOL): Software that is used for analysis, design, construction, or testing of computer programs. Ex.: an integrated collection of tools for most development phases of the life cycle, e.g., the Rational development environment.

Test Software (TST): Hardware & software necessary to operate and maintain systems and subsystems which are not consumed during the testing phase and are not allocated to a specific phase of testing. Ex.: Onboard or Deliverable Test Equipment & Software.

Intelligence & Information Software (IIS): An assembly of software applications that allows a properly designated authority to exercise control over the accomplishment of the mission. Humans manage a dynamic situation and respond to user input in real time to facilitate coordination and cooperation. Ex.: Battle Management, Mission Control. Also, software that manipulates, transports, and stores information. Ex.: Database, Data Distribution, Information Processing, Internet, Entertainment, Enterprise Services*, Enterprise Information**.

* Enterprise Services (subtype of IIS): HW & SW needed for developing functionality or software services that are unassociated, loosely coupled units of functionality. Examples: enterprise service management (monitoring, fault management), machine-to-machine messaging, service discovery, people and device discovery, metadata discovery, mediation, service security, content discovery and delivery, federated search, enterprise catalog service, data source integration, enterprise content delivery network (caching specification, distributed caching, forward staging), session management, audio & video over internet protocol, text collaboration (chat, instant messaging), collaboration (whiteboarding & annotation), application broadcasting and sharing, virtual spaces, identity management (people and device discovery), user profiling and customization.

** Enterprise Information (subtype of IIS): HW & SW needed for assessing and tailoring COTS software applications or modules that can be attributed to a specific software service or bundle of services. Examples of enterprise information systems include, but are not limited to: enterprise resource planning, enterprise data warehouse, data mart, operational data store. Examples of business / functional areas include, but are not limited to: general ledger, accounts payable, revenue and accounts receivable, funds control and budgetary accounting, cost management, financial reporting, real property inventory and management.
5.2.2.1 Finding the Productivity Type

It can be challenging to determine which productivity type should be used to estimate the cost and schedule of an application (that part of the hardware-software complex which comprises a domain). The productivity types are by design generic. By using a work breakdown structure (WBS), the environment and domain are used to determine the productivity type.

Using the WBS from MIL-STD-881C, a mapping is created from environment to Productivity Type (PT), Table 17. Starting with the environment, traverse the WBS to the lowest level where the domain is represented. Each domain is associated with a Productivity Type (PT). In real-world WBSs, the traversal from environment to PT will most likely not be the same number of levels. However, the 881C WBS provides the context for selecting the PT, which should be transferable to other WBSs.
Two examples for finding the productivity type, using the 881C Aerial Vehicle Manned (AVM) and Space Vehicle Unmanned (SVU) WBS elements, are provided below. The highest-level WBS element represents the environment. In the AVM environment there are the Avionics subsystem, the Fire Control sub-subsystem, and the sensor, navigation, air data, display, bombing computer, and safety domains. Each domain has an associated productivity type.

Table 17 Aerial Vehicle Manned to PT Example
For a space system, the highest-level 881C WBS element is the Space Vehicle Unmanned (SVU). The two subsystems are Bus and Payload. The domains for Bus address controlling the vehicle. The domains for Payload address controlling the onboard equipment. Each domain has an associated productivity type, Table 18.
Table 18 Space Vehicle Unmanned to PT Example

The full table of the MIL-STD-881C WBS mapping to productivity types is available in Appendix 9.5.
Activities in SRDR data:
- Software architectural design
- Software detailed design
- Software coding and testing
- Software integration
- Software qualification testing
- System integration
- System qualification testing
- Software installation
- Software acceptance support
Table 20 shows the different labor categories in the SRDR data. Not all of the records had all of the categories. However, the Software Engineering and Assessment categories were reported in each record. Table 14 in Chapter 5.1.5.3 provides a distribution of effort across these activities.
Table 20 SRDR Labor Categories

Management:
- Engineering Management
- Business Management

Software Engineering:
- Software Requirements Analysis
- Architecture and Detailed Design
- Coding and Unit Testing
- Test and Integration
- Qualification Testing

Assessment:
- Development Test Evaluation Support
- Software Configuration Management
- Software Quality Assurance
- Configuration Audit

Support:
- Development Environment Support
- Tools Support
- Documentation
- Data Preparation
- Process Management
- Metrics
- Training
- IT Support / Data Center
When comparing results of the CER analysis with other available CER data, it is important to keep in mind the breadth and depth of activities covered. They should be as similar as possible.
Eq 17 Effort (PM) = C + (A × KESLOC^B)

or

Eq 18 Effort (PM) = C + (KESLOC^B)

where:
- Effort is in Person-Months (PM)
- C is the fixed startup and overhead activity cost in Person-Months
- B is a scaling factor expressing the degree of the diseconomy of scale

If the NLM shows a B exponent > 1.0, then the NLM is chosen. This model unmasks the influence of fixed startup costs in a separate variable from the diseconomies of scale present in the data.

A statistic that is not available for NLMs is R², the Coefficient of Determination, used to describe how well a regression fits a set of data. This is because regression analysis cannot be used to derive an NLM; iterative search techniques are used instead. When an NLM is displayed, the R² is shown with the marker ***.
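The iterative-search derivation of an NLM can be sketched as a simple grid search over the exponent B, solving for the best-fit constant C at each step. This is a minimal illustration with synthetic data, not the procedure actually used to derive the manual's CERs:

```python
def fit_nlm(kesloc, effort, b_range=(1.0, 2.0), steps=200):
    """Grid search for Effort = C + KESLOC**B (the Eq 18 form).
    For each candidate exponent B, the least-squares-optimal C is the
    mean residual; no regression is run, so no R^2 is produced,
    matching the *** marker in the CER tables."""
    best = None
    for i in range(steps + 1):
        b = b_range[0] + (b_range[1] - b_range[0]) * i / steps
        c = sum(e - k ** b for k, e in zip(kesloc, effort)) / len(effort)
        sse = sum((e - (c + k ** b)) ** 2 for k, e in zip(kesloc, effort))
        if best is None or sse < best[0]:
            best = (sse, c, b)
    return best[1], best[2]

# Synthetic data generated from C = 20, B = 1.3 (illustration only).
ks = [5, 10, 25, 50, 100]
eff = [20 + k ** 1.3 for k in ks]
c, b = fit_nlm(ks, eff)
```

Production analyses would use a proper nonlinear optimizer, but the principle is the same: search the parameter space directly rather than regress.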
System Software

Eq 20 PM(GSF-SYS) = 20.86 + (2.35 × KESLOC^1.12)

Number of observations: 28
Adjusted R²: ***
Maximum Absolute Deviation: 0.19
PRED(30): 0.82
Minimum KESLOC Value: 5
Maximum KESLOC Value: 215
Scientific Systems

Eq 21 PM(GSF-SCI) = 34.26 + (KESLOC^1.29)

Number of observations: 24
Adjusted R²: ***
Maximum Absolute Deviation: 0.37
PRED(30): 0.56
Minimum KESLOC Value: 5
Maximum KESLOC Value: 171
Intelligence and Information Systems

Eq 22 PM(GSF-IIS) = 30.83 + (1.38 × KESLOC^1.13)

Number of observations: 23
Adjusted R²: ***
Maximum Absolute Deviation: 0.16
PRED(30): 0.91
Minimum KESLOC Value: 15
Maximum KESLOC Value: 180
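Applying a CER and checking PRED(30) can be sketched as follows. Eq 22 is taken from above; the PRED helper and sample inputs are illustrative, and the CER should only be applied within its calibration range:

```python
def pm_gsf_iis(kesloc):
    """Eq 22 CER: Ground Site Fixed, Intelligence & Information Systems.
    Calibrated for 15 to 180 KESLOC; outside that range it extrapolates."""
    return 30.83 + 1.38 * kesloc ** 1.13

def pred(actuals, estimates, level=0.30):
    """PRED(30): the fraction of estimates within 30% of the actuals."""
    hits = sum(1 for a, e in zip(actuals, estimates)
               if abs(e - a) / a <= level)
    return hits / len(actuals)

effort_pm = pm_gsf_iis(100)  # person-months for a 100 KESLOC project
```

A PRED(30) of 0.91 for Eq 22 means 91% of the calibration records were estimated within 30% of their actual effort.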
Real Time Embedded

Eq 24 PM(GV-RTE) = 84.42 + (KESLOC^1.45)

Number of observations: 22
Adjusted R²: ***
Maximum Absolute Deviation: 0.24
PRED(30): 0.73
Minimum KESLOC Value: 9
Maximum KESLOC Value: 89
6.3.3.4 Space Vehicle Unmanned (SVU) Operating Environment

Vehicle Payload

Eq 28 PM(SV-VP) = 3.15 × (KESLOC^1.38)

Number of observations: 16
Adjusted R²: 0.86
Maximum Absolute Deviation: 0.27
PRED(30): 0.50
Minimum KESLOC Value: 5
Maximum KESLOC Value: 120
Real Time Embedded

Eq 31 PM(All-RTE) = 34.32 + (KESLOC^1.52)

Number of observations: 52
Adjusted R²: ***
Maximum Absolute Deviation: 0.61
PRED(30): 0.46
Minimum KESLOC Value: 1
Maximum KESLOC Value: 167

Mission Processing

Eq 32 PM(All-MP) = 3.48 × (KESLOC^1.17)

Number of observations: 48
Adjusted R²: 0.88
Maximum Absolute Deviation: 0.49
PRED(30): 0.58
Minimum KESLOC Value: 1
Maximum KESLOC Value: 207

System Software

Eq 33 PM(All-SYS) = 16.01 + (KESLOC^1.37)

Number of observations: 60
Adjusted R²: ***
Maximum Absolute Deviation: 0.37
PRED(30): 0.53
Minimum KESLOC Value: 2
Maximum KESLOC Value: 215

Scientific Software

Eq 34 PM(All-SCI) = 21.09 + (KESLOC^1.36)

Number of observations: 39
Adjusted R²: ***
Maximum Absolute Deviation: 0.65
PRED(30): 0.18
Minimum KESLOC Value: 1
Maximum KESLOC Value: 171
Intelligence and Information Systems

Eq 35 PM(All-IIS) = 1.27 × (KESLOC^1.18)

Number of observations: 37
Adjusted R²: 0.90
Maximum Absolute Deviation: 0.35
PRED(30): 0.65
Minimum KESLOC Value: 1
Maximum KESLOC Value: 180
A future version of this manual will show CER scatter plots and 95% confidence intervals, as well as expand the number of CERs for productivity types and operating environments.
The numbers in Table 22 are the number of records analyzed. The ALL column and row are not necessarily the sums of the corresponding columns or rows. A minimum of five projects was required to derive a productivity benchmark.
6.4.2 Data Transformation

An Anderson-Darling test of the productivity data revealed a non-normal distribution. A histogram with a gamma distribution overlay visually shows this phenomenon (left plot in Figure 3). A Box-Cox transformation of the productivity data showed that if the data were transformed to log values, the distribution was much closer to a normal distribution (right plot in Figure 3).
Figure 3 Productivity Data Distribution (left: histogram of raw Productivity with gamma overlay; right: histogram of Ln(Productivity))
This observation required testing each data group's distribution for normality. Some groups required different types of transformation and some did not require any transformation at all. The results of the distribution testing are provided in Appendix 9.6.1, Normality Tests on Productivity Data.
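A quick, library-free way to see the effect of the log transformation is to compare the mean to the median before and after transforming. The sample values below are illustrative; the manual's actual tests used Anderson-Darling and Box-Cox:

```python
import math
from statistics import mean, median

def skew_indicator(data):
    """Mean/median ratio: a value well above 1 indicates right skew,
    as the Anderson-Darling test found for the raw productivity data."""
    return mean(data) / median(data)

# Right-skewed sample (illustrative values only)
prod = [40, 55, 60, 70, 80, 95, 120, 180, 310, 620]
log_prod = [math.log(x) for x in prod]
```

For this sample the raw ratio is well above 1, while the ratio of the log-transformed values is close to 1, mirroring the right plot in Figure 3.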
Table 24, Table 25, and Table 26 below show the productivity benchmark results. Results shown in italics indicate the analysis was performed on transformed data. This is important to note because statistics of data dispersion are only valid on normal distributions, i.e., they only apply in the transformed number space. However, dispersion statistics in the transformed number space do not provide much insight when converted back into linear number space; e.g., dispersion statistics in log number space are much closer together, and conversion back into linear number space results in a false measure of dispersion. Therefore the results in these tables are reported in linear number space.
The mean value of the transformed data is valid in linear number space and can be compared to other mean values. The dispersion statistics for the transformed data are, strictly speaking, only an indicator of dispersion. The standard deviation, on which the dispersion statistics rely, was derived manually in linear number space.
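The point about back-transformed statistics can be illustrated numerically: the back-transformed mean of log data is the geometric mean, which is meaningful in linear space, while a back-transformed standard deviation is only a multiplicative factor and cannot be compared to the linear-space standard deviation. Sample values are illustrative:

```python
import math
from statistics import mean, stdev

data = [40, 60, 90, 135, 300]  # illustrative productivities

arith_mean = mean(data)
# Back-transformed mean of log data = geometric mean (valid in linear space)
geo_mean = math.exp(mean(math.log(x) for x in data))
# Back-transformed SD is a multiplicative factor, not a linear-space SD
sd_linear = stdev(data)
sd_back = math.exp(stdev(math.log(x) for x in data))
```

The geometric mean sits below the arithmetic mean for right-skewed data, and the back-transformed dispersion bears no resemblance to the linear-space standard deviation, which is why the tables report dispersion derived in linear space.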
The transformations performed on each data set and the statistical summaries are provided in Appendices 9.6.1 and 9.6.2.
6.4.3 Productivity Benchmark Statistics

The tables of productivity results have a number of columns that are defined in Table 23.
Table 23 Productivity Statistics

N: Number of records
Min KESLOC: Minimum value in thousands of equivalent source lines of code
Max KESLOC: Maximum value in thousands of equivalent source lines of code
LCI: Lower Confidence Interval, an estimate of an interval below the sample mean within which the population mean is estimated to lie
Mean: Estimated sample value representing the population central value; equal to the sum of the values divided by the number of values, i.e., the arithmetic mean
UCI: Upper Confidence Interval, an estimate of an interval above the sample mean within which the population mean is estimated to lie
Std Dev: Standard Deviation, a measure of dispersion about the mean
CV: Coefficient of Variation, the extent of variability in relation to the mean of the sample; defined as the ratio of the standard deviation to the mean
Q1: Numerical value for the lower 25% of ranked data (1st Quartile), i.e., the value halfway between the lowest value and the median in a set of ranked values
Median: Numerical value separating the higher half of a sample from the lower half, i.e., the middle value in a set of ranked values
Q3: Numerical value for the lower 75% of ranked data (3rd Quartile), i.e., the value halfway between the median and the highest value in a set of ranked values
Note: Results shown in italics indicate the analysis was performed on transformed data. See the discussion in Section 6.4.2.
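The Table 23 columns for a single data group can be computed as sketched below, using Python's statistics module. The confidence interval here uses a 1.96 normal approximation, whereas the manual's intervals may be t-based, and the sample productivities are illustrative:

```python
import math
from statistics import mean, stdev, quantiles

def productivity_stats(data, z=1.96):
    """Compute the Table 23 columns for one data group. The interval uses
    a normal-approximation z of 1.96 (~95%), an assumption here."""
    n, m, sd = len(data), mean(data), stdev(data)
    half = z * sd / math.sqrt(n)
    q1, med, q3 = quantiles(data, n=4)  # 1st quartile, median, 3rd quartile
    return {"N": n, "LCI": m - half, "Mean": m, "UCI": m + half,
            "Std Dev": sd, "CV": sd / m, "Q1": q1, "Median": med, "Q3": q3}

stats = productivity_stats([44, 60, 67, 49, 58, 52, 71, 63])  # illustrative
```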
Using the median values in Table 24, Figure 4 shows a comparison of the productivities across operating environments.
Figure 4 OpEnv Median Productivities Boxplot (productivity by operating environment: AVM, GSF, MVM, OVU, SVU)
Note: Results shown in italics indicate the analysis was performed on transformed data. See the discussion in Section 6.4.2.
Using the median values in Table 25, Figure 5 shows a comparison of the productivities across productivity types.
Figure 5 PT Median Productivities Boxplot
6.4.6 Software Productivity Benchmarks by OpEnv and PT

Table 26 shows the mean and median productivity by operating environment (OpEnv) and productivity type (PT). To be included in the table, there had to be five or more records in a productivity type group. The rows are sorted on the mean productivity from lowest to highest within each OpEnv grouping.
Table 26 Productivity Benchmarks by Operating Environment and Productivity Type
OpEnv  PT   N   Min KESLOC  Max KESLOC  LCI  Mean  UCI  Std Dev  CV  Q1  Median  Q3
AVM SCP 8 6 162 47 56 65 13 23% 46 58 68
AVM RTE 9 1.5 167 69 122 174 80 66% 81 89 237
AVM MP 31 1.25 207 123 154 186 90 58% 122 141 188
GSF SCP 13 0.61 76 49 58 67 17 29% 44 60 67
GSF RTE 23 9.4 89 110 129 147 45 35% 99 130 149
GSF MP 6 14.5 91 120 162 203 52 32% 126 158 199
GSF SYS 28 5.1 215 217 240 264 64 26% 199 245 274
GSF SCI 23 4.5 171 230 271 312 99 37% 201 274 313
GSF IIS 23 15.2 180 341 376 410 85 23% 305 359 436
MVM SCP 7 5.99 24 25 39 53 19 50% 24 33 46
MVM RTE 6 2.21 38 -13 113 239 158 139% 62 110 247
MVM SCI 15 2 54 119 185 251 131 71% 55 170 327
MVM MP 7 4.931 13 150 189 228 52 28% 136 217 241
MVM SYS 23 2.05 47 199 234 269 86 37% 184 226 324
OVU RTE 11 1.2 116 96 141 410 76 54% 56 127 192
Note: Results shown in italics indicate the analysis was performed on transformed data. See the discussion in Section 6.4.2.
A future version of this manual will use additional data to examine productivity changes within
an operating environment and productivity type.
For incremental and evolutionary development projects, the second option is to treat the earlier increments as reused software and apply reuse factors to them (such as the percent of the design, code, and integration modified, perhaps adjusted for degree of software understandability and programmer unfamiliarity [Boehm et al. 2000]). This can be done either uniformly across the set of previous increments, or by having these factors vary by previous increment or by subsystem. This will produce an equivalent SLOC (ESLOC) size for the effect of modifying the previous increments, to be added to the size of the new increment in estimating effort for the new increment. In tracking the size of the overall system, it is important to remember that these ESLOC are not actual lines of code to be included in the size of the next release.
The third option is to include an Incremental Development Productivity Decline (IDPD) factor, or perhaps multiple factors varying by increment or subsystem. Unlike hardware, where unit costs tend to decrease with added production volume, the unit costs of later software increments tend to increase, due to previous-increment breakage and usage feedback, and due to increased integration and test effort. Thus, using hardware-driven or traditional software-driven estimation methods for later increments will lead to underestimates and overruns in both cost and schedule.
A relevant example was a large defense software system that had the following characteristics:
- 5 builds, 7 years, $100M
- Build 1 productivity: over 300 SLOC/person-month
- Build 5 productivity: under 150 SLOC/person-month, including Build 1-4 breakage, integration, and rework
- 318% change in requirements across all builds
A factor-of-2 decrease in productivity across four new builds corresponds to an average build-to-build IDPD factor of 19%. A recent quantitative IDPD analysis of a smaller software system yielded an IDPD of 14%, with significant variations from increment to increment [Tan et al. 2009]. Similar IDPD phenomena have been found for large commercial software, such as the multi-year slippages in the delivery of Microsoft's Word for Windows [Gill-Iansiti 1994] and Windows Vista, and for large agile development projects that assumed a zero IDPD factor [Elssamadisy-Schalliol 2002].

Based on experience with similar projects, the following impact causes and ranges per increment are conservatively stated in Table 28:
Table 28 IDPD Effort Drivers

Less effort due to more experienced personnel, assuming reasonable initial experience level:
  Variation depending on personnel turnover rates: 5-20%
More effort due to code base growth:
  Breakage, maintenance of full code base: 20-40%
  Diseconomies of scale in development, integration: 10-25%
  Requirements volatility, user requests: 10-25%
In the best case, there would be 20% more effort (from the ranges above: −20% + 20% + 10% + 10%); for a 4-build system, the IDPD would be 6%. In the worst case, there would be 85% more effort (from the ranges above: −5% + 40% + 25% + 25%); for a 4-build system, the IDPD would be 23%. In either case, with a fixed staff size, there would be either a schedule increase or incomplete builds. The difference between 6% and 23% may not look too serious, but the cumulative effect on schedule across a number of builds is very serious.
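The 6% and 23% figures are consistent with treating the per-build decline as the geometric average over the three build-to-build steps of a 4-build system:

```python
# Best case: -20% (experience) + 20% + 10% + 10% = 20% more effort by Build 4.
# Worst case: -5% + 40% + 25% + 25% = 85% more effort by Build 4.
# A 4-build system has 3 build-to-build steps, so the average per-build
# decline is the cube root of the total effort growth.
best_idpd = 1.20 ** (1 / 3) - 1    # ~0.06
worst_idpd = 1.85 ** (1 / 3) - 1   # ~0.23
```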
A simplified illustrative model relating productivity decline to the number of builds needed to reach 4M ESLOC follows. Assume that the two-year Build 1 production of 1M SLOC can be developed at 200 SLOC/PM. This means it will need 208 developers (5,000 PM / 24 months). Assume a constant staff size of 208 for all builds. The analysis shown in Figure 6 shows the impact on the amount of software delivered per build and the resulting effect on the overall delivery schedule as a function of the IDPD factor. Many incremental development cost estimates assume an IDPD of zero, and an on-time delivery of 4M SLOC in 4 builds. However, as the IDPD factor increases and the staffing level remains constant, the productivity decline per build stretches the schedule out to twice as long for an IDPD of 20%.
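This illustrative model can be sketched directly: with constant staff and build duration, the delivered size per build shrinks geometrically, so with a 20% IDPD, reaching 4M SLOC takes eight builds instead of four, i.e., twice the schedule:

```python
def builds_to_reach(target_ksloc, first_build_ksloc, idpd, max_builds=20):
    """With constant staff and constant build duration, each build delivers
    (1 - idpd) times the size of the previous build. Returns the number of
    builds needed to accumulate the target size."""
    delivered, builds, per_build = 0.0, 0, float(first_build_ksloc)
    while delivered < target_ksloc and builds < max_builds:
        delivered += per_build
        per_build *= 1.0 - idpd
        builds += 1
    return builds

# 4,000 KSLOC target, 1,000 KSLOC first build (the Figure 6 scenario)
schedule = {d: builds_to_reach(4000, 1000, d) for d in (0.0, 0.10, 0.15, 0.20)}
```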
Thus, it is important to understand the IDPD factor and its influence when doing incremental or evolutionary development. Ongoing research indicates that the magnitude of the IDPD factor may vary by type of application (infrastructure software having higher IDPDs since it tends to be tightly coupled and touches everything; applications software having lower IDPDs if it is architected to be loosely coupled), or by recency of the build (older builds may be more stable). Further data collection and analysis would be very helpful in improving the understanding of the IDPD factor.
Figure 6 Effects of IDPD on Number of Builds to Achieve 4M SLOC (cumulative KSLOC vs. build number for 0%, 10%, 15%, and 20% productivity declines)
7.1.2 Net-centric Systems of Systems (NCSoS)

If one is developing software components for use in an NCSoS, changes in the interfaces between the component systems and independently evolving NCSoS-internal or NCSoS-external systems will add further effort. The amount of effort may vary by the tightness of the coupling among the systems; the complexity, dynamism, and compatibility of purpose of the independently evolving systems; and the degree of control that the NCSoS protagonist has over the various component systems. The latter ranges from Directed SoS (strong control), through Acknowledged (partial control) and Collaborative (shared interests) SoSs, to Virtual SoSs (no guarantees) [USD(AT&L) 2008].
For estimation, one option is to use requirements volatility as a way to assess increased effort. Another is to use existing models such as COSYSMO [Valerdi 2008] to estimate the added coordination effort across the NCSoS [Lane 2009]. A third approach is to have separate models for estimating the systems engineering, NCSoS component systems development, and NCSoS component systems integration to estimate the added effort [Lane-Boehm 2007].
Figure 7 COTS and Services Intensive Systems Growth in USC e-Services Projects (percentage of projects per year, 1997 to 2002)
Such applications are highly cost-effective, but present several sizing and cost estimation challenges:
- Model directives generate source code in Java, C++, or other third-generation languages, but unless the generated SLOC are going to be used for system maintenance, their size as counted by code counters should not be used for development or maintenance cost estimation.
- Counting model directives is possible for some types of model-driven development, but presents significant challenges for others (e.g., GUI builders).
- Except for customer-furnished or open source software that is expected to be modified, the size of NDI components should not be used for estimating.
- A significant challenge is to find appropriately effective size measures for such NDI components. One approach is to use the number and complexity of their interfaces with each other or with the software being developed. Another is to count the amount of glue code SLOC being developed to integrate the NDI components, with the proviso that such glue code tends to be about 3 times as expensive per SLOC as regularly developed code [Basili-Boehm 2001]. A similar approach is to use the interface elements of function points for sizing [Galorath-Evans 2006].
- A further challenge is that much of the effort in using NDI is expended in assessing candidate NDI components and in tailoring them to the given application. Some initial guidelines for estimating such effort are provided in the COCOTS model [Abts 2004].
- Another challenge is that the effects of COTS and Cloud services evolution are generally underestimated during software maintenance. COTS products generally provide significant new releases on the average of about every 10 months, and generally become unsupported after three new releases. With Cloud services, one does not have the option to decline new releases, and updates occur more frequently. One way to estimate this source of effort is to consider it as a form of requirements volatility.
- Another serious concern is that functional size measures such as function points, use cases, or requirements will be highly unreliable until it is known how much of the functionality is going to be provided by NDI components or Cloud services.
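The glue-code sizing approach described above can be sketched as follows; the 3x multiplier is the cited rule of thumb, and the function name and sample sizes are illustrative:

```python
def development_esloc(new_sloc, glue_sloc, glue_multiplier=3.0):
    """Equivalent size for an NDI-based application: new code counts at 1x,
    glue code is weighted by the ~3x per-SLOC cost cited for glue code.
    The multiplier is a rule of thumb, not a calibrated constant."""
    return new_sloc + glue_sloc * glue_multiplier

size = development_esloc(10_000, 2_000)  # 10 KSLOC new + 2 KSLOC glue
```

Under this weighting, 2 KSLOC of glue code contributes the equivalent of 6 KSLOC of regularly developed code.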
7.1.6 Agile and Kanban Development

The difficulties of software maintenance estimation can often be mitigated by using workflow management techniques such as Kanban [Anderson 2010]. In Kanban, individual maintenance upgrades are given Kanban cards (kanban is the Japanese word for card; the approach originated with the Toyota Production System). Workflow management is accomplished by limiting the number of cards introduced into the development process, and pulling the cards into the next stage of development (design, code, test, release) when open capacity is available (each stage has a limit on the number of cards it can be processing at a given time). Any buildups of upgrade queues waiting to be pulled forward are given management attention to find and fix bottleneck root causes, or to rebalance the manpower devoted to each stage of development. A key Kanban principle is to minimize work in progress.
An advantage of Kanban is that if upgrade requests are relatively small and uniform, there is no need to estimate their required effort; they are pulled through the stages as capacity is available, and if the capacities of the stages are well tuned to the traffic, work gets done on schedule. However, if too large an upgrade is introduced into the system, it is likely to introduce delays as it progresses through the stages. Thus, some form of estimation is necessary to determine right-size upgrade units, but it does not have to be precise as long as the workflow management pulls the upgrade through the stages. For familiar systems, performers will be able to right-size the units. For Kanban in less familiar systems, and for sizing builds in agile methods such as Scrum, group consensus techniques such as Planning Poker [Cohn 2005] or Wideband Delphi [Boehm 1981] can generally serve this purpose.
The key point here is to recognize that estimation of knowledge work can never be perfect, and to create development approaches that compensate for variations in estimation accuracy. Kanban is one such; another is the agile methods approach of timeboxing or schedule-as-independent-variable (SAIV), in which maintenance upgrades or incremental development features are prioritized, and the increment is architected to enable dropping of features to meet a fixed delivery date (with Kanban, prioritization occurs in determining which of a backlog of desired upgrade features gets the next card). Such prioritization is a form of value-based software engineering, in that the higher-priority features can be flowed more rapidly through Kanban stages [Anderson 2010], or in general given more attention in defect detection and removal via value-based inspections or testing [Boehm-Lee 2005; Li-Boehm 2010]. Another important point is that the ability to compensate for rough estimates does not mean that data on project performance does not need to be collected and analyzed. Such data is even more important as a sound source of continuous improvement and change adaptability efforts.
Figure 8 Summary of Different Processes
The Single Step model is the traditional waterfall model, in which the requirements are pre-specified, and the system is developed to the requirements in a single increment. Single-increment parametric estimation models, complemented by expert judgment, are best for this process.
The Pre-specified Sequential incremental development model is not evolutionary. It just splits up the development in order to field an early Initial Operational Capability, followed by several Pre-Planned Product Improvements (P3Is). When requirements are pre-specifiable and stable, it enables a strong, predictable process. When requirements are emergent and/or rapidly changing, it often requires very expensive rework when it needs to undo architectural commitments. Cost estimation can be performed by sequential application of single-step parametric models plus the use of an IDPD factor, or by parametric model extensions supporting the estimation of increments, including options for increment overlap and breakage of existing increments, such as the COCOMO II Incremental Development Model (COINCOMO) extension described in Appendix B of [Boehm et al. 2000].
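As a rough illustration of "sequential application of single-step parametric models plus an IDPD factor," the sketch below degrades productivity by a constant percentage on each successive increment. The 15% IDPD value, the constant-decline form, and the productivity figures are illustrative assumptions, not SRDR results:

```python
def incremental_effort(increment_sizes_ksloc, base_productivity_sloc_pm, idpd=0.15):
    """Sum effort (person-months) over increments, degrading productivity
    by the IDPD factor for each successive increment (assumed form)."""
    total_pm = 0.0
    for n, size_ksloc in enumerate(increment_sizes_ksloc):
        productivity = base_productivity_sloc_pm * (1.0 - idpd) ** n
        total_pm += size_ksloc * 1000.0 / productivity
    return total_pm

# Four 25-KSLOC increments at a nominal 250 SLOC/PM: per-increment effort
# grows from 100 PM on the first increment to ~163 PM on the fourth.
effort = incremental_effort([25, 25, 25, 25], 250.0)
```

The point of the sketch is that ignoring IDPD (here, simply 4 x 100 = 400 PM) materially underestimates the total for later increments.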
The Evolutionary Sequential model rapidly develops an initial operational capability and upgrades it based on operational experience. Pure agile software development fits this model: if something is wrong, it will be fixed in 30 days in the next release. Rapid fielding also fits this model for larger or hardware-software systems. Its strength is getting quick-response capabilities in the field. For pure agile, it can fall prey to an easiest-first set of architectural commitments which break when, for example, it tries to add security or scalability as a new feature in a later increment. For rapid fielding, it may be expensive to keep the development team together while waiting for usage feedback, but it may be worth it. For small agile projects, group consensus techniques such as Planning Poker are best; for larger projects, parametric models with an IDPD factor are best.
Evolutionary Overlapped covers the special case of deferring the next increment until critical enablers such as desired new technology, anticipated new commercial product capabilities, or needed funding become available or mature enough to be added.
Evolutionary Concurrent has the systems engineers handling the change traffic and re-baselining the plans and specifications for the next increment, while keeping the development stabilized for the current increment. Its example and pros and cons are provided in Table 29.
Table 29 Situation-Dependent Processes and Estimation Approaches

Single Step
  Examples: Stable; high-assurance
  Pros: Pre-specifiable full-capability requirements
  Cons: Emergent requirements or rapid change
  Cost Estimation: Single-increment parametric estimation models

Pre-specified Sequential
  Examples: Platform base plus P3Is
  Pros: Pre-specifiable full-capability requirements
  Cons: Emergent requirements or rapid change
  Cost Estimation: COINCOMO or repeated single-increment parametric model estimation with IDPD

Evolutionary Sequential
  Examples: Small: Agile; Large: Evolutionary Development
  Pros: Adaptability to change
  Cons: Easiest-first; late, costly breakage
  Cost Estimation: Small: Planning-poker-type; Large: Parametric with IDPD and Requirements Volatility

Evolutionary Overlapped
  Examples: COTS-intensive systems
  Pros: Immaturity risk avoidance
  Cons: Delay may be noncompetitive
  Cost Estimation: Parametric with IDPD and Requirements Volatility

Evolutionary Concurrent
  Examples: Mainstream product lines; Systems of systems
  Pros: High assurance with rapid change
  Cons: Highly coupled systems with very rapid change
  Cost Estimation: COINCOMO with IDPD for development; COSYSMO for re-baselining
All cost estimation approaches also include an expert judgment crosscheck.
Table 30 provides criteria for deciding which of the five classes of incremental and evolutionary acquisition (EvA) defined in Table 29 to use, plus the choice of non-incremental, single-step development.
The Single Step to Full Capability process exemplified by the traditional waterfall or sequential Vee model is appropriate if the product's requirements are pre-specifiable and have a low probability of significant change; and if there is no value in or opportunity to deliver a partial product capability. A good example would be the hardware portion of a geosynchronous satellite.
The Pre-specified Sequential process is best if the product's requirements are pre-specifiable and have a low probability of significant change; and if waiting for the full system to be developed incurs a loss of important and deliverable incremental mission capabilities. A good example would be a well-understood and well-prioritized sequence of software upgrades to a programmable radio.
The Evolutionary Sequential process is best when there is a need to get operational feedback on a quick-response capability before defining and developing the next increment's content. Agile methods fit into this category, as do systems undergoing rapid competitive change.
The Evolutionary Overlapped process is best when one does not need to wait for operational feedback, but may need to wait for next-increment enablers such as technology maturity,
external system capabilities, or needed resources. A good example is the need to wait for a mature release of an anticipated commercial product.
The Evolutionary Concurrent process is best when the enablers are available, but there is a great deal of change traffic to be handled that would destabilize the team developing the current increment. Examples may be new competitive threats, emergent user capability needs, external system interface changes, technology matured on other programs, or COTS upgrades.
Table 30 Process Model Decision Table
Criteria, in column order: stable pre-specifiable requirements? / OK to wait for full system to be developed? / need to wait for next-increment priorities? / need to wait for next-increment enablers?

Single Step:               Yes / Yes / -   / -
Pre-specified Sequential:  Yes / No  / -   / -
Evolutionary Sequential:   No  / No  / Yes / -
Evolutionary Overlapped:   No  / No  / No  / Yes
Evolutionary Concurrent:   No  / No  / No  / No

Example enablers: Technology maturity; External system capabilities; Needed resources
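The decision logic of Table 30 can be expressed as a small function. This is a direct transcription of the table's rows, applying the criteria in column order:

```python
def choose_process_model(stable_prespecifiable_reqs, ok_to_wait_for_full_system,
                         need_to_wait_for_priorities, need_to_wait_for_enablers):
    """Select an acquisition process class per the Table 30 decision criteria."""
    if stable_prespecifiable_reqs:
        # Both "Yes" rows hinge only on whether a full-system wait is acceptable.
        return "Single Step" if ok_to_wait_for_full_system else "Pre-specified Sequential"
    if need_to_wait_for_priorities:
        return "Evolutionary Sequential"
    if need_to_wait_for_enablers:
        return "Evolutionary Overlapped"
    return "Evolutionary Concurrent"
```

For example, a product with emergent requirements whose next increment must wait on a maturing commercial product maps to Evolutionary Overlapped.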
This manual is still a work in progress. There are more SRDR records to be analyzed. With more data, the results presented here will be expanded and refined. The list below presents an overview of the next steps:
Expand the SLOC count type conversions for each of the six programming languages.
Expand the adapted code parameter table to additional productivity types and operating environments.
Expand the average effort percentages table, Table 14, to additional productivity types.
Analyze schedule duration for the different activities.
Expand the number of CERs for each productivity type and operating environment.
Increase the coverage of the productivity benchmarks for each operating environment and productivity type.
Segment the productivity benchmarks by software size groups.
There are two additional chapters that should be in this manual:
1. Software Code Growth
This is a very important topic, as it will impact the sensitivities of a CER-based estimate.
2. Software Maintenance CERs and Productivity Benchmarks
With the very large base of DoD software, maintenance cost estimation is another important topic.
9 Appendices
9.1 Acronyms
4GL     Fourth Generation Language
AAF     Adaptation Adjustment Factor: used with adapted software to produce an equivalent size. It includes the effects of Design Modified (DM), Code Modified (CM), and Integration Modified (IM).
AAM     Adaptation Adjustment Multiplier
ACAT    Acquisition Category
ACEIT   Automated Cost Estimating Integrated Tools
ACWP    Actual Cost of Work Performed
AMS     Acquisition Management System
ASP     Acquisition Support Plan
AV      Aerial Vehicle
AVM     Aerial Vehicle Manned, e.g., Fixed-wing aircraft, Helicopters
AVU     Aerial Vehicle Unmanned, e.g., Remotely piloted air vehicles
BCWP    Budgeted Cost of Work Performed
BCWS    Budgeted Cost of Work Scheduled
BFP     Basic Feature Point
C/SCSC  Cost/Schedule Control System Criteria
CAPE    Cost Assessment and Program Evaluation (an OSD organization)
CARD    Cost Analysis Requirements Document
CDA     Central Design Authority
CDD     Capability Description Document
CDR     Critical Design Review
CDRL    Contract Data Requirements List
CER     Cost Estimating Relationship
CM      Code Modified Percentage
CMM     Capability Maturity Model
CO      Contracting Officer
COCOMO  COnstructive COst MOdel
COCOTS  COnstructive COTS
COTS    Commercial Off-the-Shelf
CPM     Critical Path Method
CSC     Computer Software Component
CSCI    Computer Software Configuration Item
CSDR    Cost and Software Data Report
CSU     Computer Software Unit
DACIMS  Defense Automated Cost Information Management System
DCARC   Defense Cost and Resource Center
DDE     Dynamic Data Exchange
DM      Design Modified Percentage
DoD     Department of Defense
EA      Evolutionary Acquisition
EI      External Inputs
EIF     External Interfaces
EO      External Outputs
EQ      External Inquiries
EVMS    Earned Value Management System
FAACEH  FAA Cost Estimating Handbook
FAAPH   FAA Pricing Handbook
FAQ     Frequently Asked Questions
FCA     Functional Configuration Audit
FPA     Function Point Analysis
FPC     Function Point Count
FPH     FAA Pricing Handbook
GAO     U.S. General Accounting Office
GS      Ground Site
GSF     Ground Site Fixed, e.g., Command Post, Ground Operations Center, Ground Terminal, Test Facilities
GSM     Ground Site Mobile, e.g., Intelligence gathering stations mounted on vehicles, Mobile missile launcher
GUI     Graphical User Interface
GV      Ground Vehicle
GVM     Ground Vehicle Manned, e.g., Tanks, Howitzers, Personnel carrier
GVU     Ground Vehicle Unmanned, e.g., Robot vehicles
HOL     Higher Order Language
HWCI    Hardware Configuration Item
IDPD    Incremental Development Productivity Decline
ICE     Independent Cost Estimate
IEEE    Institute of Electrical and Electronics Engineers
IFPUG   International Function Point Users Group
IIS     Intelligence and Information Software
ILF     Internal Files
IM      Integration Modified Percentage
IRS     Interface Requirement Specification
IS      Information System
KDSI    Thousands of Delivered Source Instructions
LCC     Life Cycle Cost
MDAP    Major Defense Acquisition Program
MOU     Memorandum of Understanding
MPSS    Most Productive Scale Size
MTTD    Mean Time To Detect
MP      Mission Processing
MSLOC   Millions of Source Lines of Code
MV      Maritime Vessel
MVM     Maritime Vessel Manned, e.g., Aircraft carriers, destroyers, supply ships, submarines
MVU     Maritime Vessel Unmanned, e.g., Mine hunting systems, Towed sonar array
NASA    National Aeronautics and Space Administration
NCCA    Naval Center for Cost Analysis
NDI     Non-Development Item
NLM     Non-Linear Model
NRaD    United States Navy's Naval Command, Control, Surveillance Center, RDT&E Division, Software Engineering Process Office
OO      Object Oriented
OpEnv   Operating Environment
OSD     Office of the Secretary of Defense
OV      Ordnance Vehicle
OVU     Ordnance Vehicle Unmanned, e.g., Air-to-air missiles, Air-to-ground missiles, Smart bombs, Strategic missiles
PCA     Physical Configuration Audit
PERT    Program Evaluation and Review Technique
PC      Process Control
PLN     Planning software
Pr      Productivity
PT      Productivity Type
RTE     Real-Time Embedded
RUP     Rational Unified Process
SAIV    Schedule As an Independent Variable
SCI     Scientific software
SCP     Sensor Control and Signal Processing
SDD     Software Design Document
SDP     Software Development Plan
SDR     Software Design Review
SEER    A tool suite produced by Galorath
SEER-SEM  System Evaluation and Estimation of Resources - Software Estimating Model
SEI     Software Engineering Institute
SER     Schedule Estimating Relationship
SLIM    A tool suite produced by Quantitative Software Management
SLIM    Software Life Cycle Model
SLOC    Source Lines of Code
SRDR    Software Resource Data Report
SRR     Systems Requirements Review
SRS     Software Requirements Specification
SSCAG   Space Systems Cost Analysis Group
SSR     Software Specification Review
SSS     System Segment Specification
SU      Software Understanding
SV      Space Vehicle
SVM     Space Vehicle Manned, e.g., Passenger vehicle, Cargo vehicle, Space station
SVU     Space Vehicle Unmanned, e.g., Orbiting satellites (weather, communications), Exploratory space vehicles
SYS     System Software
TEL     Telecommunications software
TOOL    Software Tools
TST     Test software
TRN     Training Software
UCC     Universal Code Counter
UNFM    Programmer Unfamiliarity
USC     University of Southern California
VC      Vehicle Control
VP      Vehicle Payload
WBS     Work Breakdown Structure
Figure 9 Unified Code Count Summary Output Example
The example in Figure 9 shows the summary SLOC count with a total of 3,375 Logical SLOC, consisting of 619 data declarations and 2,127 executable instructions. This is the software size to be used for estimation, measurement of actuals, and model calibration.
4 http://csse.usc.edu/research/CODECOUNT/
Table 31 Rating Scale for Software Understanding

Structure
  Very Low:  Very low cohesion, high coupling, spaghetti code.
  Low:       Moderately low cohesion, high coupling.
  Nominal:   Reasonably well-structured; some weak areas.
  High:      High cohesion, low coupling.
  Very High: Strong modularity, information hiding in data / control structures.

Application Clarity
  Very Low:  No match between program and application world-views.
  Low:       Some correlation between program and application.
  Nominal:   Moderate correlation between program and application.
  High:      Good correlation between program and application.
  Very High: Clear match between program and application world-views.

Self-Descriptiveness
  Very Low:  Obscure code; documentation missing, obscure or obsolete.
  Low:       Some code commentary and headers; some useful documentation.
  Nominal:   Moderate level of code commentary, headers, documentation.
  High:      Good code commentary and headers; useful documentation; some weak areas.
  Very High: Self-descriptive code; documentation up-to-date, well-organized, with design rationale.

SU Increment to ESLOC
  Very Low: 50   Low: 40   Nominal: 30   High: 20   Very High: 10
Programmer Unfamiliarity
Unfamiliarity (UNFM) quantifies how unfamiliar the person modifying the software is with it. The UNFM factor is applied multiplicatively to SU to account for this familiarity. For example, a person who developed the adapted software and is intimate with it does not have to undertake the understanding effort. See Table 32 below.
Table 32 Rating Scale for Programmer Unfamiliarity
UNFM Increment to ESLOC   Level of Unfamiliarity
0.0   Completely familiar
0.2   Mostly familiar
0.4   Somewhat familiar
0.6   Considerably unfamiliar
0.8   Mostly unfamiliar
1.0   Completely unfamiliar
The nonlinear effects of SU and UNFM are added to the linear approximation given by AAF (discussed in Section 2.3.2) to compute ESLOC. A higher-fidelity adapted code adjustment factor is given by the Adaptation Adjustment Multiplier (AAM):
Eq 37   AAM = AAF x (1 + 0.02 x SU x UNFM)   (when AAF <= 50%)
Eq 38   AAM = AAF + (SU x UNFM)              (when AAF > 50%)
The new total equivalent size for software composed of new and adapted software is:
Eq 39   Total Equivalent Size = New Size + (AAM x Adapted Size)
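These relationships can be sketched in code. Following the COCOMO II convention (an interpretation, since the equations above leave units implicit), AAF is computed in percentage points from the DM/CM/IM percentages and the result is normalized to a 0-1 multiplier by dividing by 100; the 0.4/0.3/0.3 weights match the worked AAM calculation later in this appendix:

```python
def aaf(dm, cm, im):
    """Adaptation Adjustment Factor, in percentage points.
    dm, cm, im: % Design, Code, and Integration Modified (0-100)."""
    return 0.4 * dm + 0.3 * cm + 0.3 * im

def aam(dm, cm, im, su=0, unfm=0.0):
    """Adaptation Adjustment Multiplier per Eq 37-38.
    su: Software Understanding increment (10-50, Table 31);
    unfm: Programmer Unfamiliarity (0.0-1.0, Table 32)."""
    f = aaf(dm, cm, im)
    if f <= 50:
        return (f * (1 + 0.02 * su * unfm)) / 100.0   # Eq 37
    return (f + su * unfm) / 100.0                    # Eq 38

def total_equivalent_size(new_size, adapted_size, dm, cm, im, su=0, unfm=0.0):
    """Eq 39: Total Equivalent Size = New Size + AAM x Adapted Size."""
    return new_size + aam(dm, cm, im, su, unfm) * adapted_size
```

For instance, with DM = CM = 0% and IM = 1%, `aam` returns 0.003, so 3,000 KSLOC of such legacy code contributes 9 KSLOC of equivalent size.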
9.3.1 Examples
Care must be taken when assigning the adaptation parameters for the legacy code to compute its equivalent size. For example, the difference between the small values of AAF = 0% and AAF = 5% is a tripling of the equivalent size for the upgrade. Some regression testing of untouched legacy code is inevitable, and the factor for % Integration Required should be investigated carefully in terms of the relative effort involved. This might be done by quantifying the number of regression tests performed and their manual intensity compared to the tests of new functionality. If the % Integration Required for the legacy code is 1%, then the adaptation factor for it would be:
Eq 41   AAM = [0.4 x (0% DM) + 0.3 x (0% CM) + 0.3 x (1% IM)] = 0.003
The total ESLOC for new, modified and reused (legacy) code, assuming the AAF for the modified code is 25%, is:
Eq 42   ESLOC = 75 + (20 x 0.25) + (3 MSLOC x 0.003) = 75 + 5 + 9 = 89 KSLOC
In this case, building on top of the legacy baseline was 9/89, or about 10%, of the work.
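The arithmetic of Eq 41-42 can be checked directly (all sizes in KSLOC; the 0.4/0.3/0.3 weights on DM/CM/IM follow the AAF definition used in Eq 41):

```python
# Worked check of Eq 41-42 (all sizes in KSLOC).
new_ksloc = 75.0
modified_ksloc, aaf_modified = 20.0, 0.25   # AAF of 25% for the modified code
legacy_ksloc = 3000.0                       # 3 MSLOC of reused legacy code

# Eq 41: AAM for the legacy code with DM = CM = 0% and IM = 1%
aam_legacy = 0.4 * 0.00 + 0.3 * 0.00 + 0.3 * 0.01      # = 0.003

# Eq 42: total equivalent size = 75 + 5 + 9 = 89 KSLOC
esloc = new_ksloc + aaf_modified * modified_ksloc + aam_legacy * legacy_ksloc
legacy_share = (aam_legacy * legacy_ksloc) / esloc     # about 10% of the work
```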
Figure 10 SRDR Page 1 (top)
Figure 10 SRDR Page 1 (bottom)
Figure 11 SRDR Page 2
9.4.1 Proposed Modifications
In 2010, the modifications in Table 33 were proposed to the DCARC. Most of the recommendations were incorporated, to some degree, in the 2011 SRDR instructions. It will take several years before data appears with the additional data items.
The relevance of this section is to demonstrate that DCARC will evolve the SRDR to meet future information needs. SRDR users need to be proactive with the DCARC to express their needs for different types of data. The recommendations in Table 33 were accompanied with examples of data analysis highlighting the shortfalls in the current data collection.
Table 33 Recommended SRDR Modifications

Current 2007 SRDR: Application Types (3.7.1-17)
Proposed Modifications: Reorganize around Operating Environments and Application Domains. Add Mission Criticality (add reliability and complexity in a single rating scale). Revisit detailed definitions of the Application Domains.
Rationale: Reduce duplication. Structure productivity analysis domains, database planning guidance. Account for productivity variations.

Current 2007 SRDR: Amount of New (>25%), Modified (<25% mod) Code
Proposed Modifications: Add DM, CM, IM, SU, & UNFM factors for modified code. Incorporate Galorath-like questionnaire. Add IM for reused code. Definitions for code types. Count at the level it will be maintained.
Rationale: Improve derivation of equivalent SLOC for use in calibration and estimation. Excludes COTS; more accurate for generated code. Includes the code base for evolutionary acquisition.

Current 2007 SRDR: Deleted Code
Proposed Modifications: Report deleted code counts.
Rationale: Deleting code does take effort.

Current 2007 SRDR: Software and External Interface Requirements
Proposed Modifications: Add anticipated requirements volatility to 2630-1, 2. Use percentage of requirements change as volatility input (SRR baseline).
Rationale: CARD realism. Traceability. Improve calibration and estimation accuracy.

Current 2007 SRDR: Personnel Experience & Turnover
Proposed Modifications: Add to 2630-1. Expand years of experience rating scale to 12 years.
Rationale: CARD realism. Traceability. Improve calibration and estimation accuracy.

Current 2007 SRDR: Project- or CSCI-level data
Proposed Modifications: Specify the level of data reporting.
Rationale: Apples-to-apples comparison. Improved data analysis.
Table 33 Recommended SRDR Modifications (continued)

Current 2007 SRDR: All Other Direct Software Development Effort (4.7): Project Management; IV&V; Configuration Management; Quality Control; Problem Resolution; Library Management; Process Improvement; Measurement; Training; Documentation; Data Conversion; Customer-run Acceptance Test; Software Delivery, Installation & Deployment
Proposed Modifications: Break into: Software Engineering Management functions; Configuration / Environment functions; Assessment functions; Organization functions (e.g. user & maintainer documentation, measurement, training, process improvement, etc.)
Rationale: Improve calibration and estimation accuracy for different functions.

Current 2007 SRDR: Product Quality: Mean Time To critical Defect (MTTD); Analogy with Similar Systems
Proposed Modifications: Are there better measures, e.g.: Total number of priority 1, 2, 3, 4, & 5 defects discovered; Total number of priority 1, 2, 3, 4, & 5 defects removed
Rationale: There is limited quality information. If it is not going to be reported, why put it on the form?
Hydraulic Subsystem VC
Electrical Subsystem VC
Fuel Subsystem VC
Landing Gear VC
Rotor Group VC
Drive System VC
Intercoms RTE
Radar SCP
AVM Avionics
Radio SCP
Bombing Computer MP
Chaff SCP
Magazines RTE
Data Formatting MP
Accelerometers SCP
Stores Management MP
Payload-specific software VP
Primary Power VC
Flight Termination/Mission Termination RTE
Propulsion software VC
Reentry System VC
OVU Encasement Device    Encasement Device Software MP
9.5.3 Ordnance Vehicle Unmanned (OVU)
Source: MIL-STD-881C Appendix D: Ordnance Systems
Payload software VP
Primary Power VC
Motor Engine VC
Propulsion software VC
Munition Software MP
9.5.4 Maritime Vessel Manned (MVM)
Source: MIL-STD-881C Appendix E: Sea Systems
9.5.5 Space Vehicle Manned / Unmanned (SVM/U) and Ground Site Fixed (GSF)
Source: MIL-STD-881C Appendix F: Space Systems
Propulsion VC
Sensor SCP
9.5.6 Ground Vehicle Manned and Unmanned (GVM/U)
Source: MIL-STD-881C Appendix G: Surface Vehicle Systems
Armament
  Missile Launchers VP
  Non-Lethal Weapons VP
  Automatic Ammunition Handling MP
Primary Vehicle
  Navigation and Remote Piloting RTE
  Communications RTE
Ground Control Systems RTE
GVU Remote Control System (UGV specific)
  Command and Control Subsystem C&C
  Remote Control System Software RTE
9.5.7 Aerial Vehicle Unmanned (AVU) & Ground Site Fixed (GSF)
Source: MIL-STD-881C Appendix H: Unmanned Air Vehicle Systems
AVU Air Vehicle - Vehicle Subsystems
  Hydraulic Subsystem VC
  Electrical Subsystem VC
  Environmental Control VC
  Landing Gear VC
  Rotor Group VC
  Drive System VC
  Communication / Identification RTE
AVU Payload
  Stores Management VP
  Mission Processing MP
  Survivability Payload VP
  Reconnaissance Payload VP
  Electronic Warfare Payload VP
9.5.8 Maritime Vessel Unmanned (MVU) and Maritime Vessel Manned (MVM)
Source: MIL-STD-881C Appendix I: Unmanned Maritime Vessel Systems
Navigation RTE
Maneuvering System VC
Emergency Systems MP
Survivability Payload VP
Intelligence, Surveillance Reconnaissance Payload VP
Payload
Armament / Weapons Delivery Payload VP
Mission Payload VP
MVM Shipboard Segment
  Shipboard Communication Subsystem RTE
  Shipboard Power Subsystem VC
OVU Power VC
Flight Software VC
9.5.10 Ground Site Fixed (GSF)
Source: MIL-STD-881C Appendix K: Automated Information Systems
CE Training
  Equipment TRN
  Simulators SCI
  Computer Based-Application BIS
9.6.1.2 Productivity Types (all Operating Environments)
Table 35 PT Productivity Normality Tests
Productivity Type (PT) N A2 P Ft
1 Intel and Information Processing (IIS) 35 0.53 0.162 Loge
2 Mission Processing (MP) 47 0.59 0.117 Loge
3 Real-Time Embedded (RTE) 53 0.17 0.927 Loge
4 Scientific Systems (SCI) 39 0.76 0.044 x1.5
5 Sensor Control and Signal Processing (SCP) 38 0.62 0.100 Not Required
6 System Software (SYS) 60 0.30 0.566 Not Required
9.6.2.1 Operating Environments
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 102.23-151.27 (non-transformed), 10.111-12.299 (transformed); 95% CI for StDev: 75.11-112.05 (non-transformed), 3.062-4.568 (transformed).]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 182.26-261.01 / 13.500-16.156; 95% CI for StDev: 107.53-139.40 / 3.791-4.914.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 169.14-233.40 / 13.005-15.277; 95% CI for StDev: 154.82-218.33 / 4.893-6.901.]
[Figure: graphical summary (transformation not required). N = 16; Mean 132.50; StDev 90.30; Variance 8154.88; Skewness 0.23432; Kurtosis -1.36368; Minimum 9.95; 1st Quartile 50.78; Median 119.94; 3rd Quartile 225.61; Maximum 277.35; 95% CI for Mean: 84.38-180.62.]
[Figure: graphical summary (transformation not required). N = 6; Mean 84.888; StDev 43.829; Variance 1921.005; Skewness 0.509669; Kurtosis 0.074706; Minimum 27.802; 1st Quartile 52.411; Median 76.020; 3rd Quartile 125.189; Maximum 152.927; 95% CI for Mean: 38.892-130.884.]
9.6.2.2 Productivity Types
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 330.50-437.97 / 5.8006-6.0821; 95% CI for StDev: 133.13-215.64 / 0.2991-0.4844.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 138.37-183.91 / 4.9299-5.2144; 95% CI for StDev: 91.74-138.67 / 0.4118-0.6224.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 107.71-142.35 / 4.6793-4.9583; 95% CI for StDev: 60.99-89.91 / 0.4282-0.6312.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 167.52-280.60 / 2170.0-4700.3; 95% CI for StDev: 97.25-153.37 / 2068.1-3261.3.]
[Figure: graphical summary (transformation not required). N = 38; Mean 49.810; StDev 19.422; Variance 377.224; Skewness -0.17597; Kurtosis -1.13890; Minimum 9.951; 1st Quartile 32.964; Median 52.608; 3rd Quartile 66.526; Maximum 80.043; 95% CI for Mean: 43.426-56.194.]
[Figure: graphical summary (transformation not required). N = 60; Mean 224.74; StDev 78.35; Variance 6139.38; Skewness 0.368573; Kurtosis -0.131695; Minimum 60.61; 1st Quartile 171.46; Median 219.15; 3rd Quartile 274.11; Maximum 421.14; 95% CI for Mean: 204.50-244.98.]
9.6.2.3 Operating Environment - Productivity Type Sets
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 125.59-171.30 / 4.8330-5.1434; 95% CI for StDev: 70.42-117.79 / 0.3986-0.6667.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 80.13-239.66 / 4.3834-5.4791; 95% CI for StDev: 52.66-149.35 / 0.3779-1.0719.]
[Figure: graphical summary (transformation not required). N = 8; Mean 56.078; StDev 12.840; Variance 164.866; Skewness -0.679094; Kurtosis -0.347898; Minimum 33.322; 1st Quartile 46.146; Median 57.835; 3rd Quartile 67.806; Maximum 70.440; 95% CI for Mean: 45.344-66.813.]
[Figure: graphical summary (transformation not required). N = 23; Mean 375.51; StDev 85.30; Variance 7276.65; Skewness 0.714674; Kurtosis 0.262814; Minimum 236.41; 1st Quartile 304.45; Median 358.54; 3rd Quartile 435.94; Maximum 580.72; 95% CI for Mean: 338.62-412.40.]
[Figure: graphical summary (transformation not required). N = 6; Mean 161.69; StDev 51.55; Variance 2657.58; Skewness 0.28077; Kurtosis 1.21448; Minimum 87.27; 1st Quartile 125.49; Median 158.30; 3rd Quartile 199.44; Maximum 243.16; 95% CI for Mean: 107.59-215.79.]
[Figure: graphical summary (transformation not required). N = 23; Mean 128.79; StDev 45.19; Variance 2042.14; Skewness 0.681898; Kurtosis 0.726621; Minimum 51.22; 1st Quartile 99.29; Median 130.33; 3rd Quartile 148.53; Maximum 239.18; 95% CI for Mean: 109.25-148.33.]
[Figure: graphical summary, non-transformed vs. transformed with Ft. 95% CI for median: 246.89-307.19 / 60953-94388; 95% CI for StDev: 75.56-138.28 / 34622-63359.]
[Graphical summary figure: two panels; only the interval values are recoverable]
95% Confidence Interval for Median: 48.128 to 66.582 (first panel); 2426.8 to 4433.3 (second panel)
95% Confidence Interval for StDev:  11.906 to 27.407 (first panel); 1203.5 to 2770.5 (second panel)
[Graphical summary figure; annotation: "Not Required"]
Mean              240.20
StDev              63.60
Variance         4045.48
Skewness         0.56326
Kurtosis         1.35086
N                     28
Minimum           115.23
1st Quartile      199.19
Median            245.29
3rd Quartile      274.11
Maximum           421.14
95% Confidence Interval for Mean: 215.54 to 264.87
[Graphical summary figure; annotation: "Not Required"]
Mean              188.89
StDev              52.13
Variance         2717.31
Skewness        -0.35267
Kurtosis        -2.10327
N                      7
Minimum           121.86
1st Quartile      127.32
Median            202.04
3rd Quartile      236.43
Maximum           242.48
95% Confidence Interval for Mean: 140.68 to 237.10
[Graphical summary figure: two panels; only the interval values are recoverable]
95% Confidence Interval for Median: 46.79 to 349.81 (first panel); 3.7739 to 5.7760 (second panel)
95% Confidence Interval for StDev:  93.33 to 366.70 (first panel); 0.5601 to 2.2009 (second panel)
[Graphical summary figure; annotation: "Not Required"]
Mean              185.20
StDev             130.83
Variance        17117.26
Skewness         0.51734
Kurtosis        -1.05039
N                     15
Minimum            36.11
1st Quartile       54.51
Median            169.46
3rd Quartile      326.65
Maximum           431.11
95% Confidence Interval for Mean: 112.75 to 257.65
[Graphical summary figure; annotation: "Not Required"]
Mean              38.896
StDev             19.265
Variance         371.132
Skewness         1.48446
Kurtosis         2.51764
N                      7
Minimum           20.243
1st Quartile      23.505
Median            33.071
3rd Quartile      45.978
Maximum           77.186
95% Confidence Interval for Mean: 21.079 to 56.713
[Graphical summary figure; annotation: "Not Required"]
Mean              234.35
StDev              85.89
Variance         7376.37
Skewness        0.515429
Kurtosis       -0.673920
N                     21
Minimum           103.63
1st Quartile      177.76
Median            202.15
3rd Quartile      303.39
Maximum           393.13
95% Confidence Interval for Mean: 195.26 to 273.45
[Graphical summary figure; annotation: "Not Required"]
Mean              141.07
StDev              75.88
Variance         5758.12
Skewness        0.429912
Kurtosis       -0.693482
N                     11
Minimum            49.82
1st Quartile       55.68
Median            126.72
3rd Quartile      192.35
Maximum           277.35
95% Confidence Interval for Mean: 90.09 to 192.05
9.7 References

[Abts 2004] Abts, C., Extending the COCOMO II Software Cost Model to Estimate Effort and Schedule for Software Systems Using Commercial-off-the-Shelf (COTS) Software Components: The COCOTS Model, PhD Dissertation, Department of Industrial and Systems Engineering, University of Southern California, May 2004.

[Anderson 2010] Anderson, D., Kanban, Blue Hole Press, 2010.

[Banker Kemerer 1989] Banker, R., and Kemerer, C., Scale Economies in New Software Development, IEEE Transactions on Software Engineering, Vol. 15, No. 10, October 1989.

[Beck 2000] Beck, K., Extreme Programming Explained, Addison-Wesley, 2000.

[Boehm 1981] Boehm, B., Software Engineering Economics, Englewood Cliffs, NJ, Prentice Hall, 1981.

[Boehm et al. 2000] Boehm, B., Abts, C., Brown, W., Chulani, S., Clark, B., Horowitz, E., Madachy, R., Reifer, D., Steece, B., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.

[Boehm et al. 2000b] Boehm, B., Abts, C., Chulani, S., Software Development Cost Estimation Approaches - A Survey, USC-CSE-00-505, 2000.

[Boehm et al. 2004] Boehm, B., Bhuta, J., Garlan, D., Gradman, E., Huang, L., Lam, A., Madachy, R., Medvidovic, N., Meyer, K., Meyers, S., Perez, G., Reinholtz, K. L., Roshandel, R., Rouquette, N., Using Empirical Testbeds to Accelerate Technology Maturity and Transition: The SCRover Experience, Proceedings of the 2004 International Symposium on Empirical Software Engineering, IEEE Computer Society, 2004.

[Boehm Lee 2005] Boehm, B., and Lee, K., Empirical Results from an Experiment on Value-Based Review (VBR) Processes, Proceedings, ISESE 2005, September 2005.

[Boehm 2009] Boehm, B., Applying the Incremental Commitment Model to Brownfield System Development, Proceedings, CSER 2009.

[Boehm Lane] Boehm, B. W., and Lane, J. A., Using the Incremental Commitment Model to Integrate System Acquisition, Systems Engineering and Software Engineering, Tech Report 2007-715, University of Southern California, 2007.
[Boehm Lane 2010] Boehm, B., and Lane, J., DoD Systems Engineering and Management Implications for Evolutionary Acquisition of Major Defense Systems, Proceedings, CSER 2010.

[Bisignani Reed 1988] Bisignani, M., and Reed, T., Software Security Costing Issues, COCOMO Users Group Meeting, Los Angeles: USC Center for Software Engineering, 1988.

[Booch 2009] Personal communication from Grady Booch, IBM, 2009.

[Broy 2010] Broy, M., Seamless Method- and Model-based Software and Systems Engineering, The Future of Software Engineering, Springer, 2010.

[Cohn 2005] Cohn, M., Agile Estimating and Planning, Prentice Hall, 2005.

[Colbert Boehm 2008] Colbert, E., and Boehm, B., Cost Estimation for Secure Software & Systems, ISPA/SCEA 2008 Joint International Conference, June 2008.

[CSSE 2000] Guidelines for Model-Based (System) Architecting and Software Engineering (MBASE), available at http://sunset.usc.edu/classes_cs577b, 2000.

[DCARC 2005] Defense Cost and Resource Center, The DoD Software Resource Data Report - An Update, Proceedings of the Practical Software Measurement (PSM) Users Group Conference, 19 July 2005.

[Galorath 2005] Galorath Inc., SEER-SEM User Manual, 2005.

[Brooks 1995] Brooks, F., The Mythical Man-Month, Addison-Wesley, 1995.

[Elssamadisy Schalliol 2002] Elssamadisy, A., and Schalliol, G., Recognizing and Responding to Bad Smells in Extreme Programming, Proceedings, ICSE 2002, pp. 617-622.

[Galorath Evans 2006] Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach Publications, 2006.

[Garmus Heron 2000] Garmus, D., and Herron, D., Function Point Analysis: Measurement Practices for Successful Software Projects, Boston, Mass.: Addison-Wesley, 2000.

[Gill Iansiti 1994] Gill, G., and Iansiti, M., Microsoft Corporation: Office Business Unit, Harvard Business School Case Study 691-033, 1994.
[Goethert et al. 1992] Goethert, W., Bailey, E., and Busby, M., Software Effort and Schedule Measurement: A Framework for Counting Staff Hours and Reporting Schedule Information, Software Engineering Institute, Carnegie Mellon University, ESC-TR-92-021, 1992.

[Hopkins Jenkins 2008] Hopkins, R., and Jenkins, K., Eating the IT Elephant: Moving from Greenfield Development to Brownfield, IBM Press, 2008.

[IFPUG 1994] Function Point Counting Practices: Manual Release 4.0, International Function Point Users Group, Blendonview Office Park, 5008-28 Pine Creek Drive, Westerville, OH 43081-4899.

[IFPUG 2009] International Function Point Users Group, http://www.ifpug.org

[ISO 12207] ISO/IEC 12207, International Standard on Information Technology Software Lifecycle Processes, International Organization for Standardization (ISO), 1995.

[ISO 1999] ISO JTC1/SC27, Evaluation Criteria for IT Security, in Part 1: Introduction and General Model, International Organization for Standardization (ISO), 1999.

[Jensen 1983] Jensen, R., An Improved Macrolevel Software Development Resource Estimation Model, Proceedings of the 5th ISPA Conference, 1983.

[Koolmanojwong Boehm 2010] Koolmanojwong, S., and Boehm, B., The Incremental Commitment Model Process Patterns for Rapid-Fielding Projects, Proceedings, ICSP 2010, Paderborn, Germany.

[Kruchten 1998] Kruchten, P., The Rational Unified Process, Addison-Wesley, 1998.

[Lane 2009] Lane, J., Cost Model Extensions to Support Systems Engineering Cost Estimation for Complex Systems and Systems of Systems, 7th Annual Conference on Systems Engineering Research 2009 (CSER 2009).

[Lane Boehm 2007] Lane, J., and Boehm, B., Modern Tools to Support DoD Software-Intensive System of Systems Cost Estimation - A DACS State-of-the-Art Report, August 2007.

[Lewis et al. 2008] Lewis, G., et al., SMART: Analyzing the Reuse Potential of Legacy Components on a Service-Oriented Architecture Environment, CMU/SEI-2008-TN-008.

[Li et al. 2009] Li, Q., Li, M., Yang, Y., Wang, Q., Tan, T., Boehm, B., Hu, C., Bridge the Gap between Software Test Process and Business Value: A Case Study, Proceedings, ICSP 2009, pp. 212-223.
[Lum et al. 2001] Lum, K., Powell, J., Hihn, J., Validation of Spacecraft Software Cost Estimation Models for Flight and Ground Systems, JPL Report, 2001.

[Madachy 1997] Madachy, R., Heuristic Risk Assessment Using Cost Factors, IEEE Software, May 1997.

[Madachy Boehm 2006] Madachy, R., and Boehm, B., A Model of Options and Costs for Reliable Autonomy (MOCA) Final Report, report submitted to NASA for USRA contract #4481, 2006.

[Madachy Boehm 2008] Madachy, R., and Boehm, B., Comparative Analysis of COCOMO II, SEER-SEM and True S Software Cost Models, USC-CSSE-2008-816, 2008.

[NCCA AFCAA 2008] Naval Center for Cost Analysis and Air Force Cost Analysis Agency, Software Development Cost Estimation Handbook Volume 1 (Draft), Software Technology Support Center, September 2008.

[Nguyen 2010] Nguyen, V., Improved Size and Effort Estimation Models for Software Maintenance, PhD Dissertation, Department of Computer Science, University of Southern California, December 2010, http://csse.usc.edu/csse/TECHRPTS/by_author.html#Nguyen

[Park 1988] Park, R., The Central Equations of the PRICE Software Cost Model, COCOMO Users Group Meeting, 1988.

[Park 1992] Park, R., Software Size Measurement: A Framework for Counting Source Statements, Software Engineering Institute, Carnegie Mellon University, ESC-TR-92-020, 1992.

[Putnam 1978] Putnam, L. H., Example of an Early Sizing, Cost and Schedule Estimate for an Application Software System, Computer Software and Applications Conference (COMPSAC), 1978.

[Putnam Myers 1992] Putnam, L. H., and Myers, W., Measures for Excellence: Reliable Software on Time, Within Budget, Prentice Hall, Inc., Englewood Cliffs, NJ, 1992.

[Putnam Myers 2003] Putnam, L. H., and Myers, W., Five Core Metrics, Dorset House Publishing, New York, NY, 2003.

[PRICE 2005] PRICE Systems, TRUE S User Manual, 2005.

[QSM 2003] Quantitative Software Management, SLIM-Estimate for Windows Users Guide, 2003.
[Reifer et al. 1999] Reifer, D., Boehm, B., Chulani, S., The Rosetta Stone: Making COCOMO 81 Estimates Work with COCOMO II, CrossTalk, 1999.

[Reifer 2008] Reifer, D., Twelve Myths of Maintenance, Reifer Consultants, Inc., 2008.

[Reifer 2002] Reifer, D., Let the Numbers Do the Talking, CrossTalk, Vol. 15, No. 3, March 2002.

[Reifer 2002] Reifer, D., Security: A Rating Concept for COCOMO II, Reifer Consultants, Inc., 2002.

[Royce 1998] Royce, W., Software Project Management: A Unified Framework, Addison-Wesley, 1998.

[Schwaber Beedle 2002] Schwaber, K., and Beedle, M., Scrum: Agile Software Development, Prentice Hall, 2002.

[Selby 1988] Selby, R., Empirically Analyzing Software Reuse in a Production Environment, in Software Reuse: Emerging Technology, W. Tracz (Ed.), IEEE Computer Society Press, 1988.

[Stutzke 2005] Stutzke, R. D., Estimating Software-Intensive Systems, Upper Saddle River, NJ: Addison-Wesley, 2005.

[Tan et al. 2009] Tan, T., Li, Q., Boehm, B., Yang, Y., He, M., and Moazeni, R., Productivity Trends in Incremental and Iterative Software Development, Proceedings, ACM-IEEE ESEM 2009.

[Tan 2012] Tan, T., Domain-Based Effort Distribution Model for Software Cost Estimation, PhD Dissertation, Computer Science Department, University of Southern California, June 2012.

[USC 2006] University of Southern California Center for Software Engineering, Model Comparison Report, Report to NASA AMES, Draft Version, July 2006.

[USD(AT&L) 2008] Systems Engineering Guide for System of Systems, Version 1.0, OUSD(AT&L), June 2008.

[Valerdi 2011] Valerdi, R., Systems Engineering Cost Estimation with COSYSMO, Wiley, 2011.

[Yang et al. 2005] Yang, Y., Bhuta, J., Boehm, B., and Port, D., Value-Based Processes for COTS-Based Applications, IEEE Software, Volume 22, Issue 4, July-August 2005, pp. 54-62.