Having worked for a CMM Level 5 organisation known in the industry for its processes and quality metrics, I am collating here the metrics that can help us scrutinise projects from the right perspective.
Defect Metrics
Defect Removal Efficiency (DRE)
DRE is a measure of how well defects are detected before delivery. It is calculated as the percentage of defects identified and corrected internally with respect to the total defects found over the complete project life cycle. Thus, DRE is the percentage of bugs eliminated by reviews, inspections, tests, etc.
Here is an interesting Defect Removal Efficiency calculator by Harvey Roy Divinagracia:
http://www.engin.umd.umich.edu/CIS/course.des/cis525/java/f00/harvey/DRE_Calc.htm
Defect Removal Efficiency: DRE = E1 / (E1 + E2), where E1 = defects found before IT and E2 = defects found from IT onwards, till date.
Project/Activity XXX: DRE = 0.32070708 (32.07071 %), where E1 = 127 and E2 = 269.
Industry Standard - 85%.
According to Software Productivity Research chairman Capers Jones’ paper, “Software Defect-Removal Efficiency” (IEEE Computer, April 1996), software products can be considered “high quality” if the defect removal efficiency reaches 95 percent. The defect removal efficiency is a quality metric calculated by dividing the development defects (defects reported prior to product release) by the total number of defects (the sum of the development defects plus the defects reported during the first year of product use). In other words, for best-in-class companies, the investment in defect removal yields 95 percent of the defects prior to product release, and according to Jones, the industry standard of 85 percent just isn’t good enough.
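To make the arithmetic concrete, here is a minimal Python sketch of the DRE calculation; the function name is mine, and the sample figures simply reuse E1 = 127 and E2 = 269 from the example above.

    def defect_removal_efficiency(found_internally, found_after):
        """DRE = defects removed before delivery / total defects found so far.

        found_internally -- defects found and fixed internally, before delivery (E1)
        found_after      -- defects found afterwards, till date (E2)
        """
        total = found_internally + found_after
        if total == 0:
            raise ValueError("No defects recorded; DRE is undefined.")
        return found_internally / total


    # Figures from the example above: E1 = 127, E2 = 269
    dre = defect_removal_efficiency(127, 269)
    print(f"DRE = {dre:.8f} ({dre * 100:.5f} %)")   # DRE = 0.32070707 (32.07071 %)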
Defects/KLOC
Number of defects found per 1000 lines of code.
For a Japanese client I worked with, the expectation was 0.25 defects/KLOC, i.e. 1 defect per 4 KLOC.
Here's an interesting blog post that cites some figures from books:
http://amartester.blogspot.com/2007/04/bugs-per-lines-of-code.html
The book "Code Complete" by Steve McConnell has a brief section about error expectations. He basically says that the range of possibilities can be as follows:
(a) Industry Average: "about 15 - 50 errors per 1000 lines of delivered code." He further says this is usually representative of code that has some level of structured programming behind it, but probably includes a mix of coding techniques.
(b) Microsoft Applications: "about 10 - 20 defects per 1000 lines of code during in-house testing, and 0.5 defects per KLOC (KLOC = 1000 lines of code) in released product (Moore 1992)." He attributes this to a combination of code-reading techniques and independent testing (discussed further in another chapter of his book).
(c) "Harlan Mills pioneered 'cleanroom development', a technique that has been able to achieve rates as low as 3 defects per 1000 lines of code during in-house testing and 0.1 defect per 1000 lines of code in released product (Cobb and Mills 1990). A few projects - for example, the space-shuttle software - have achieved a level of 0 defects in 500,000 lines of code using a system of format development methods, peer reviews, and statistical testing."
NASA has a defect density of 0.004 bugs/KLOC (vs 5.00 bugs/KLOC for the industry), but this comes at a cost of $850.0/LOC (vs $5.0/LOC for the industry). Source:
agileindia.org/agilecoimbatore07/presentations/…
Defect Density (DD)
Defects/KLOC is one flavor of DD. Defect Density measures the number of defects in a particular size of code and is calculated as follows:
DD = Defects/Size of the Project
Size of the project can be expressed in Function Points, Feature Points, Use Cases, KLOC, etc.
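A quick illustrative sketch of the DD formula in Python; the function name and the sample figures are hypothetical.

    def defect_density(defects, project_size):
        """DD = Defects / Size of the Project.

        project_size can be in any agreed size measure:
        Function Points, Feature Points, Use Cases, KLOC, ...
        """
        if project_size <= 0:
            raise ValueError("Project size must be positive.")
        return defects / project_size


    # Hypothetical figures: 120 defects in 80 KLOC, or the same defects over 450 Function Points
    print(defect_density(120, 80))    # 1.5 defects per KLOC
    print(defect_density(120, 450))   # ~0.27 defects per Function Point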
Defect Age:
Defect Age (in Time) is the difference in time between the date a defect is detected and the current date (if the defect is still open) or the date the defect was fixed (if the defect is already fixed).
Defect Age (in Phases) is the difference in phases between the defect injection phase and the defect detection phase.
This article explains defect age both in time and by phase. I have used defect age by phase, and it helps to identify the problem area. Though most of the time we are tempted to conclude that the injection phase is Coding, deeper probing often reveals that Requirements was the actual injection phase.
The later the phase in which a defect is identified, the higher the effort to fix it and the greater the chance of impacting the three sides of the project triangle: cost, time and scope.
Number the phases from Requirements (1) through Acceptance (8); the lower the resulting defect age, the better the project health.
Average Defect Age (in phases) = Sum over all defects of (Phase Detected - Phase Injected) / Number of Defects
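Putting the two flavours of defect age together, here is a small Python sketch; the phase numbers follow the 1-to-8 numbering suggested above, while the phase names and the defect log are made-up examples.

    from datetime import date

    # Phase numbering as suggested above: Requirements = 1 ... Acceptance = 8 (names are illustrative)
    PHASES = {
        "Requirements": 1, "High-Level Design": 2, "Detailed Design": 3,
        "Coding": 4, "Unit Testing": 5, "Integration Testing": 6,
        "System Testing": 7, "Acceptance": 8,
    }

    def defect_age_in_days(detected_on, fixed_on=None):
        """Defect age in time: fix date (or current date if still open) minus detection date."""
        end = fixed_on or date.today()
        return (end - detected_on).days

    def average_defect_age_in_phases(defects):
        """Average Defect Age (phases) = sum(phase detected - phase injected) / number of defects."""
        ages = [PHASES[d["detected"]] - PHASES[d["injected"]] for d in defects]
        return sum(ages) / len(ages)

    # Hypothetical defect log
    defect_log = [
        {"injected": "Requirements", "detected": "System Testing"},   # age 6 phases
        {"injected": "Coding",       "detected": "Unit Testing"},     # age 1 phase
        {"injected": "Requirements", "detected": "Acceptance"},       # age 7 phases
    ]
    print(average_defect_age_in_phases(defect_log))                  # (6 + 1 + 7) / 3 ≈ 4.67
    print(defect_age_in_days(date(2024, 3, 1), date(2024, 3, 15)))   # 14 days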
Effort Metrics:
How accurate was the estimate compared with the actual project effort?
Effort Estimation Accuracy (EEA) - Motorola uses this metric.
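A minimal sketch of how EEA could be computed, assuming the common reading of estimation accuracy as actual effort divided by estimated effort; the figures are hypothetical, and this is an interpretation rather than a verbatim reproduction of Motorola's definition.

    def effort_estimation_accuracy(actual_effort, estimated_effort):
        """EEA = actual effort / estimated effort (assumed definition).

        A value of 1.0 means the estimate was exact; above 1.0 means the
        project took more effort than estimated, below 1.0 means less.
        """
        if estimated_effort <= 0:
            raise ValueError("Estimated effort must be positive.")
        return actual_effort / estimated_effort


    # Hypothetical project: estimated 400 person-days, actually spent 460
    print(effort_estimation_accuracy(460, 400))   # 1.15 -> effort overrun of 15 %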
