
Otto Vinter

Software Engineering Mentor

Analysing Your Defect Data for Improvement Potential

Otto Vinter
Software Engineering Mentor, DK-2630 Taastrup, Denmark Email: vinter@inet.uni2.dk www.vinter.suite.dk

Introduction
Why is it that we seem to make the same mistakes over and over again? It can't be that we don't want to know why, that we don't want to find ways to prevent our mistakes and ultimately to improve the quality of our products. Surely we do. Analysis of defect data from previous projects, for example, tells us about our most frequent errors and can help us improve. Yet very few companies take the time to analyse their experience data and learn from it. In this paper I will explain how you can organise and perform an analysis of your company's defects in a cost-effective way. The defect analysis process presented here uses problem reports from past projects to help experienced process consultants identify specific, frequently occurring problems. Defect analysis puts you in a better position to determine how problems might be prevented, and thus serves to implement focused improvement actions. I have practical evidence from a number of real-life projects that defect analysis is a simple and effective approach that can solve major problems in a company's software development process and give real benefits to the company.

Defect Analysis Models


Basically, there are two different models you can apply to defect analysis: a quantitative, mathematical approach, or a qualitative, causal approach.

Otto Vinter Sthensvej 2F DK-2630 Taastrup

Email: vinter@inet.uni2.dk Web site: www.vinter.suite.dk Skype name: otto.vinter

Tel/Fax: +45 4399 2662 Mobile Tel: +45 4045 0771 CVR nr (VAT): DK 17 58 96 44

The quantitative, mathematical approach

The advantages of a quantitative approach are that you can perform:
- quantitative analysis on the data
- statistical analysis and regression
- growth curve modelling
- comparison to historical data

However, a quantitative approach is not easily translated into corrective action, which is what you ultimately want from an analysis of defects. It is not enough just to know that things look a certain way (usually bad); you must be able to propose improvement actions to your management. Furthermore, mathematical models often fail to predict because:
- low-level maturity processes are the norm in companies
- software development is not a production process
- personal and team issues dominate

The qualitative, causal approach

The advantages of a qualitative approach are that you can:
- perform causal analysis (root cause)
- investigate details of a sample of defects
- have flexibility on what will be analysed
- translate the results easily to specific improvement actions

Furthermore, a causal analysis often works because it:
- focuses on prevention
- addresses individual and team issues
- enables feedback, education, and motivation
- promotes a needs-based, bottom-up process improvement

However, a qualitative approach is time consuming and may be difficult to aggregate across different types of teams and projects. Because time is money, management often turns away from this approach or puts severe restrictions on the use of people and time.

Defect Classifications
Many different defect classifications exist, with widely varying scopes. For the most part, though, they are simple ad-hoc checklists developed internally by the process or quality group in the company, and therefore of little use for large-scale defect analyses. There are, however, two prominent defect classifications, which also represent two different purposes of defect analysis: you can either perform defect analysis on-the-fly during the execution of a project, with the purpose of changing the course of the current project; or as a retrospective activity when a project has completed, with the purpose of changing the team's development processes before the team embarks on the next project.


Orthogonal Defect Classification (ODC)

ODC was originally developed by Ram Chillarege et al. at IBM. It has the following characteristics:
- Tracks projects on-the-fly
- Separates classification and analysis (performed by different people)
- Fixed and limited taxonomy
- No qualitative information is collected during classification

You can find more information on ODC at the following web sites: www.research.ibm.com/softeng/ODC/ODC.HTM and www.chillarege.com/odc.

Boris Beizer's Bug Taxonomy

Boris Beizer originally developed a taxonomy based on a large number of defects from many sources. He calls them bugs, and therefore I use the same term when I use the taxonomy. I have integrated his taxonomy into a defect analysis process, which has the following characteristics:
- Retrospective defect analysis
- Combined classification and analysis (performed by the same people)
- Extensive and expandable taxonomy
- Qualitative information can be collected during classification

You can find the taxonomy and Boris Beizer's statistics in his book [1], or find my amended version of the taxonomy and both his and my statistics at: www.vinter.suite.dk/bugtaxst.doc.
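To make the contrast concrete, a multi-level taxonomy such as Beizer's can be represented as dot-separated category codes, so that bug counts can be aggregated at any level of detail. This is only an illustrative sketch: the category codes and their meanings below are invented placeholders, not Beizer's actual numbering.

```python
from collections import Counter

# Hypothetical classifications: "major.sub.detail" category paths
classified_bugs = [
    "1.1",    # placeholder: requirements, incomplete
    "1.2",    # placeholder: requirements, ambiguous
    "4.1.2",  # placeholder: data, definition, wrong type
    "1.1",
]

# Aggregate at the top (major-category) level by truncating the path
top_level = Counter(code.split(".")[0] for code in classified_bugs)
print(top_level.most_common())  # [('1', 3), ('4', 1)]
```

The same truncation at two or three path components gives the finer-grained distributions, which is what makes an expandable multi-level taxonomy convenient to aggregate.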

The Proposed Defect Analysis Approach


I propose that you perform defect analysis as a retrospective activity, either on a regular basis at the completion of projects, or as a stand-alone assessment of the development process. In the latter case, retrospective defect analysis can also be used to measure the change before and after improvements are introduced. Used systematically by a company, it can turn into a model for experience-driven incremental software process improvement [8]. Furthermore, I propose that you perform defect analysis based on the qualitative approach, because it directly pinpoints which improvement actions should be suggested to management. However, you should always include whatever quantitative data are available in the organisation.

Retrospective Defect Analysis Steps


A retrospective defect analysis is a cooperative effort between consultants with deep knowledge about process issues and developers who possess inside knowledge about the issues raised in the problem reports.


The defect analysis process is based on interviews with developers about a number of closed problem reports that they have handled. Without this developer participation, it is not possible to reach correct conclusions about the root causes of the problems.

The steps in the proposed retrospective defect analysis process are as follows. First, the problem reports are analysed quantitatively to determine general trends and patterns. The following types of information usually give important insight into the overall problems of the project/organisation and are very useful in presentations for management:
- When was the defect found, e.g. the distribution of defects over time (before/after release)
- Where was the defect found, e.g. the distribution of defects across system parts (subsystems/modules)
- Who found the defect, e.g. the distribution of defects across stakeholder groups (project/specialists/customers)

The when can be used to discover how defect detection evolves over time and what common patterns can be derived from this. The defect detection percentage (DDP) should always be calculated. The DDP average for US companies is reported [2] to be 85%, measured at the first quarter after release, but this must be regarded as the lower limit for professional development organisations in the future.

The where can be used to determine whether there are particular areas in the product which account for a disproportionate amount of defects. Special improvement actions should be initiated to take care of such cases, e.g. redesign and redevelopment.

The who can be used to determine whether defects are found in the project group or internally in the organisation, or whether a disproportionate amount of defects are found by customers.

Combinations of these three quantitative measurements can often disclose interesting information about the organisation's development process and provoke immediate management action. The next step is to perform a qualitative analysis.
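The quantitative step above amounts to counting defects along the three dimensions and computing the DDP as the share of all known defects found before release. The following sketch assumes a minimal record layout (date found, subsystem, stakeholder group) and an example release date; none of these field choices are prescribed by the process.

```python
from collections import Counter
from datetime import date

# Hypothetical minimal record: (date found, subsystem, stakeholder group)
reports = [
    (date(1999, 3, 1), "GUI", "project"),
    (date(1999, 5, 12), "DSP", "specialists"),
    (date(1999, 9, 30), "GUI", "customers"),
    (date(1999, 11, 2), "GUI", "customers"),
]
release_date = date(1999, 8, 1)  # assumed release date

# "When": defects found before vs. after release
before = sum(1 for d, _, _ in reports if d <= release_date)
after = len(reports) - before

# "Where" and "who": distributions across subsystems and stakeholders
by_subsystem = Counter(s for _, s, _ in reports)
by_stakeholder = Counter(g for _, _, g in reports)

# Defect detection percentage: share of all known defects caught
# before release (measured here at an arbitrary cut-off date)
ddp = 100.0 * before / len(reports)

print(f"DDP = {ddp:.0f}%")          # 2 of 4 found before release -> 50%
print(by_subsystem.most_common(1))  # [('GUI', 3)]
```

In practice the DDP keeps rising as customers report more defects after the cut-off, which is why the paper specifies a fixed measurement point (the first quarter after release) for comparability.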
The problem reports are sorted according to the specific developer related to them (typically, the one who fixed the problem). An experienced process consultant then classifies and analyses those reports together with that developer in an interview session. Most projects have numerous problem reports, so we only analyse an equally spaced sample of problem reports (such as every fourth report). In this way, we maintain the correct distribution of issues over time. However, the sample should consist of at least 100 reports in order to maintain some statistical significance. For the classification of the individual problems, the taxonomy proposed by Boris Beizer is used. Beizer's taxonomy splits bugs into nine main categories, which are further detailed in up to four levels. The process consultant must have studied Beizer's taxonomy and know it well. Developers learn about the taxonomy as their problem reports are classified during the interview. Once everyone is familiar with the taxonomy, categorising a bug takes about five minutes.
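The sampling rule described above (equally spaced over time, at least 100 reports) can be expressed as a small helper. The function name and the use of a plain list to stand in for problem reports are illustrative assumptions, not part of the published process.

```python
def sample_reports(reports, minimum=100):
    """Take an equally spaced sample of at least `minimum` reports,
    preserving the distribution of issues over time. If there are
    fewer than `minimum` reports, analyse them all."""
    if len(reports) <= minimum:
        return list(reports)
    stride = max(1, len(reports) // minimum)  # e.g. every fourth report
    return reports[::stride]

# A project with 400 closed problem reports: every fourth is analysed
sample = sample_reports(list(range(400)))
print(len(sample))  # 100
```

Because the sample is taken at a fixed stride over the chronologically sorted reports rather than at random, early and late phases of the project are represented in proportion to how many defects they produced.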


During the interviews, developers are initially inclined to blame most problems either on a lack of precise requirements or on themselves for a simple coding error. However, throughout the interviews the process consultant challenges these opinions, continually pressing the developers for more information and offering other suggestions as to what might have caused the problem. This technique quickly triggers a deeper look: developers uncover more information, which typically changes the bug category and identifies the real underlying cause. At the end of the classification of a bug, Beizer's description of the selected category is reviewed to ensure that the developer agrees that this is the most probable cause of the bug.

In the final step of the defect analysis process, the developer is asked to speculate about what might have prevented the bug. The process consultant keeps asking for more specific clarifications in order to be able to define general prevention techniques. The process consultant has initially prepared a short list of common prevention techniques, and whenever a new prevention technique is found, it is added to this list. The finally agreed, most probable prevention for the bug is appended to the problem report along with the classification.

After all the interviews have been performed, the collected information is aggregated and conclusions are drawn. The most promising prevention techniques form the basis for the proposed improvement actions. All classifications, analyses, and proposals for improvements are finally presented to management.
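The aggregation at the end of the interviews amounts to a simple tally: one agreed prevention technique per analysed report, ranked by frequency. In this sketch the technique names are examples only, assumed for illustration rather than taken from the analyses in the paper.

```python
from collections import Counter

# One agreed, most probable prevention technique per analysed report
preventions = [
    "usability test of prototype",
    "static analysis",
    "usability test of prototype",
    "requirements scenario walkthrough",
    "usability test of prototype",
]

# The most frequently suggested techniques become the candidate
# improvement actions proposed to management
ranked = Counter(preventions).most_common()
for technique, count in ranked:
    print(f"{count:3d}  {technique}")
```

Ranking by frequency is what lets a handful of prevention techniques cover a large share of the sampled defects, which is the basis for proposing focused improvement actions.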

Conclusion
From defect analysis, management gains important insight into the most important problems in the company's development process and can initiate specific improvement actions based on the suggested prevention techniques. Furthermore, through the individual interviews with the experienced process consultants, developers gain an increased awareness of the types of problems that result from their software development practices. This awareness leads to increased quality consciousness among the developers in the company, and this learning process is an additional benefit of the defect analysis. I have used the above defect analysis approach on real-life projects [10,11], and I have seen that the improvements inspired by defect analysis lead to better products, both in terms of fewer defects after release and in terms of higher customer satisfaction and sales. I have published these results at several international conferences [5,6,7,12] as well as in journals [3,4,9]. Though the results referenced in this paper stem only from my work at Brüel & Kjær, I have also performed this type of defect analysis in a number of other companies and initiated several improvements based on the resulting analyses.


Biography
Otto Vinter is an independent consultant (www.vinter.suite.dk). Until 2007, he was consulting for DELTA. Before 2000, he was responsible for software process improvements at Brüel & Kjær. He has been the driving force in the company's improvement activities in testing, requirements engineering, and agile development models. He introduced defect analysis as an improvement technique and participated in all of the analyses. He has managed process improvements for 15+ years and development projects for 30 years.

References

1. Beizer B. (1990): Software Testing Techniques. Second Edition, Van Nostrand Reinhold, New York (now: Coriolis Press).
2. Jones C. (2008): Measuring Defect Potentials and Defect Removal Efficiency, CrossTalk, June 2008. (www.stsc.hill.af.mil/crosstalk/2008/06/0806Jones.html)
3. Lauesen S., O. Vinter (2000): Preventing Requirement Defects, Proceedings of the Sixth International Workshop on Requirements Engineering: Foundations of Software Quality, REFSQ 2000. (www.itu.dk/people/slauesen/Papers/PrevDefectsProceedings.pdf)
4. Lauesen S., O. Vinter (2001): Preventing Requirement Defects: An Experiment in Process Improvement, Requirements Engineering Journal, Vol. 6 No. 1, Springer-Verlag London 2001. (www.itu.dk/people/slauesen/Papers/PrevDefectsREJ.pdf or: www.vinter.suite.dk/PrevDefectsREJ.pdf)
5. Vinter O. (1996): Experience-Driven Process Improvement Boosts Software Quality, Proceedings of the Ninth International Software Quality Week, Software Research, San Francisco, USA. (www.vinter.suite.dk/SQWpap96.doc)
6. Vinter O. (1997): How to Apply Static and Dynamic Analysis in Practice, Proceedings of the Tenth International Software Quality Week, Software Research, San Francisco, USA. (www.vinter.suite.dk/SQWpap97.doc)
7. Vinter O. (1998): Improved Requirements Engineering Based on Defect Analysis, Proceedings of the Second International Software Quality Week Europe, Software Research, San Francisco, USA. (www.vinter.suite.dk/QWEpap98.doc)
8. Vinter O. (2000): Experience-Based Approaches to Process Improvement, Proceedings of the Thirteenth International Software Quality Week, Software Research, San Francisco, USA. (www.vinter.suite.dk/sqw00v2.doc)
9. Vinter O., S. Lauesen (2000): Analyzing Requirements Bugs, Software Testing & Quality Engineering Magazine, Vol. 2-6, Nov/Dec 2000. (www.vinter.suite.dk/stqema26.doc)
10. Vinter O., S. Lauesen & J. Pries-Heje (1999): A Methodology for Preventing Requirements Issues from Becoming Defects. ESSI Project 21167, Final Report, Brüel & Kjær Sound & Vibration Measurement A/S, DK-2850 Nærum, Denmark. (www.vinter.suite.dk/finalrp3.doc)
11. Vinter O., P.-M. Poulsen, K. Nissen & J.M. Thomsen (1996): The Prevention of Errors through Experience-Driven Test Efforts. ESSI Project 10438, Final Report, Brüel & Kjær A/S, DK-2850 Nærum, Denmark. (www.vinter.suite.dk/finalrep.doc)
12. Vinter O., J. Pries-Heje (2000): Scenarios, Prototyping & Usability Tests Improve Products & Requirements, Proceedings of the Eighth European International Conference on Software Testing Analysis and Review, EuroSTAR 2000, EuroSTAR Conferences, Galway, Ireland. (www.vinter.suite.dk/es00pres.doc)

