Lecture - Week - 3 - Software Metrics
CS3003
Lecture 3: Software Metrics
Lecture schedule
Week | Lecture Topic                                   | Lecturer       | Week Commencing
1    | Introducing the module and software engineering | Steve          | 28th Sept.
2    | Software maintenance and evolution              | Steve          | 5th Oct.
3    | Software metrics                                | Steve          | 12th Oct.
4    | Test-driven development                         | Giuseppe       | 19th Oct.
5    | Software structure, refactoring and code smells | Steve          | 26th Oct.
6    | Software complexity                             | Steve          | 2nd Nov.
       (Coursework released Tuesday 3rd November)
7    | ASK week                                        | N/A            | 9th Nov.
8    | Software fault-proneness                        | Steve          | 16th Nov.
9    | Clean code                                      | Steve          | 23rd Nov.
10   | Human factors in software engineering           | Giuseppe       | 30th Nov.
11   | SE techniques applied in action                 | Steve          | 7th Dec.
12   | Guest industry lecture (tba)                    | Guest lecturer | 14th Dec.
       (Coursework hand-in Monday 14th December)
Lab/seminar schedule
Week | Seminar                      | Labs                        | Week Commencing
1    | No seminar                   | No lab                      | 28th Sept.
2    | Seminar                      | Lab (introduction)          | 5th Oct.
3    | Seminar                      | Lab                         | 12th Oct.
4    | Seminar                      | Lab                         | 19th Oct.
5    | Seminar                      | Lab                         | 26th Oct.
6    | Coursework brief seminar     | No lab                      | 2nd Nov.
7    | ASK week                     | ASK week                    | 9th Nov.
8    | Seminar                      | Lab                         | 16th Nov.
9    | Coursework technique seminar | Lab                         | 23rd Nov.
10   | Seminar                      | Lab                         | 30th Nov.
11   | No seminar                   | Work on coursework (no lab) | 7th Dec.
12   | No seminar                   | Work on coursework (no lab) | 14th Dec.
Norman Fenton
Structure of this lecture
Uses of measurement
How can software size be measured?
Why is size important?
Related to effort and cost
LOC (lines of code) is a common measure of size
…but how useful is it on its own?
Should comments, blank lines and lone “}” lines count?
Many companies measure functionality (e.g. function points) rather than code length
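The ambiguity above can be made concrete. Below is a minimal sketch of a LOC counter that reports both physical lines and an "effective" count; the rules for what to skip (blanks, comments, lone braces) are one possible convention, not a standard definition, and the comment detection is deliberately crude.

```python
# Sketch: physical vs. "effective" LOC, under one possible set of
# counting rules (skip blank lines, comment lines, lone braces).

def count_loc(source: str) -> dict:
    physical = 0
    effective = 0
    for line in source.splitlines():
        physical += 1
        stripped = line.strip()
        if not stripped:                                  # blank line
            continue
        if stripped.startswith(("//", "/*", "*", "#")):   # comment (crude check)
            continue
        if stripped in ("{", "}", "};"):                  # lone brace
            continue
        effective += 1
    return {"physical": physical, "effective": effective}

example = """\
// add two numbers
int add(int a, int b)
{
    return a + b;
}
"""
print(count_loc(example))  # {'physical': 5, 'effective': 2}
```

Note how the two counts diverge even on five lines of code; different tools applying different rules will report different "sizes" for the same program, which is one reason LOC comparisons across teams are unreliable.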
How can software structure be measured?
Information flow within the system
Indicator of maintainability and coupling
Identifies critical “stress” parts of the system and design problems
Based on:
Fan-in: number of modules calling a module
Fan-out: number of modules called by a module
Ref: Marchese, PACE University
Henry & Kafura’s Complexity Metric:
Complexity of X = length(X) × (fan-in(X) × fan-out(X))²
= 10 × (3 × 2)² = 360
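The calculation above can be sketched directly. The function below implements the Henry & Kafura formula as given on the slide; the numbers are the slide's example (length 10, fan-in 3, fan-out 2).

```python
# Sketch of Henry & Kafura's information-flow complexity:
#   complexity(M) = length(M) * (fan_in(M) * fan_out(M)) ** 2

def henry_kafura(length: int, fan_in: int, fan_out: int) -> int:
    return length * (fan_in * fan_out) ** 2

# The slide's example: length 10, fan-in 3, fan-out 2.
print(henry_kafura(10, 3, 2))  # 360
```

Because fan-in and fan-out are multiplied and then squared, modules that both receive and send a lot of information flow are penalised sharply; a module with high fan-in but zero fan-out scores 0.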
A complexity measure
McCabe’s Cyclomatic Complexity measure
Commonly used by industry
In lots of tools
Any good?
Based on control flow graph
Very useful for identifying white box test cases
Attributed to Tom McCabe, who worked for IBM in the 1970s
Cyclomatic Complexity
Program P
CC(P) = #edges − #nodes + 2
#edges = 11
#nodes = 9
CC = 11 − 9 + 2 = 4
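The formula is easy to apply once the control-flow graph is in hand. Below is a minimal sketch; the adjacency list is hypothetical (the slide's actual graph is not reproduced here), chosen only to match the slide's numbers of 11 edges and 9 nodes.

```python
# Sketch: cyclomatic complexity CC = E - N + 2 for a control-flow
# graph given as an adjacency list {node: [successor, ...]}.

def cyclomatic_complexity(cfg: dict) -> int:
    nodes = set(cfg) | {t for targets in cfg.values() for t in targets}
    edges = sum(len(targets) for targets in cfg.values())
    return edges - len(nodes) + 2

# Hypothetical CFG with 9 nodes and 11 edges.
cfg = {
    1: [2],
    2: [3, 4],     # a branch
    3: [5],
    4: [5],
    5: [6, 7],     # another branch
    6: [8],
    7: [8],
    8: [2, 9],     # loop back plus exit
    9: [],
}
print(cyclomatic_complexity(cfg))  # 4
```

CC = 4 also tells a tester that at least four linearly independent paths through this code must be exercised, which is why the metric is popular for white-box test planning.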
Vergilio et al 2006
Complexity Metrics
Class M has a DIT (Depth of Inheritance Tree) of 0: it sits at the root of the inheritance hierarchy
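DIT can be computed by walking from a class up to the root. The sketch below uses a simple child-to-parent map; the class names are hypothetical, with M as the root (depth 0, matching the slide).

```python
# Sketch: DIT (Depth of Inheritance Tree) from a child -> parent map.
# By this convention the root class has depth 0.

def dit(cls: str, parent: dict) -> int:
    depth = 0
    while cls in parent:   # keep climbing until we reach a root
        cls = parent[cls]
        depth += 1
    return depth

parent = {"A": "M", "B": "M", "C": "A"}  # M is the root; C extends A extends M
print(dit("M", parent))  # 0
print(dit("C", parent))  # 2
```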
C&K metrics (cont.)
Coupling between objects (CBO)
The number of other classes to which a class is coupled
Lack of cohesion of methods (LCOM)
Based on how far methods share the class’s attributes; method pairs that access no common attribute indicate low cohesion
CBO
LCOM Example
http://www.tusharma.in/technical/revisiting-lcom/
Thresholds
Metric thresholds:
What is the best number of methods in a class?
What is the best number of attributes in a class?
What is the best number of LOC in a method?
What is the best number of LOC in a class?
What would you consider the optimal level in each case?
Answer?
What is important about measures?
Direct measures
Measures that can have numbers directly attributed to them
Examples include:
Length of source code (measured in LOC)
Effort of programmers on a project
Indirect measures
Measures that cannot be taken directly; they must be calculated from other (direct) measures
Examples include:
Fault rate per day = number of faults in a week / 5
(assuming a 5-day working week)
Area-of-a-room example: area is derived from two direct measures, length and width
Problems with metrics in the real world
There is a tendency for professionals to display over-optimism and over-confidence in metrics
Metrics may cause more harm than good
Data is often shown because it is easy to gather and display
Metrics may have a negative effect on developer productivity and well-being
What are other practical problems with collecting metrics?
Software Metric Usage
Use common sense and organizational sensitivity when
interpreting metrics data
Provide regular feedback to the individuals and teams
who have worked to collect measures and metrics.
Don’t use metrics to appraise individuals
Never use metrics to threaten individuals or teams.
Metrics data that indicate a problem area should not be
considered “negative”.
These data are merely an indicator for process improvement
NASA’s use of metrics
Cone of uncertainty (McConnell)
What does the cone show?
At the beginning of a software project:
Estimates are subject to large uncertainty
As the project progresses, the uncertainty decreases
Eventually, the uncertainty reaches 0% at the project end, when actual effort and cost are known
How metrics can help us make decisions (an “audit grid”)
[Figure: an audit grid plotting ten systems (numbered 1–10) on two axes: business value (vertical) against system quality (horizontal). The quadrants distinguish high/low business value and high/low quality, e.g. “high business value, low quality” versus “high business value, high quality”.]
Test-based metrics
How many test cases have been designed per requirement?
How many test cases have still to be designed?
How many test cases have been executed?
How many test cases passed or failed?
How many bugs were identified, and what were their severities?
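The questions above can all be answered from a simple test-run record. Below is a minimal sketch over a hypothetical set of test cases; the field names (`req`, `executed`, `passed`) are made up for illustration.

```python
# Sketch: deriving the slide's test-based metrics from a small,
# hypothetical test-run record.
from collections import Counter

test_cases = [
    {"req": "R1", "executed": True,  "passed": True},
    {"req": "R1", "executed": True,  "passed": False},
    {"req": "R2", "executed": True,  "passed": True},
    {"req": "R2", "executed": False, "passed": None},  # not yet run
]

per_requirement = Counter(tc["req"] for tc in test_cases)      # designed per requirement
executed = sum(tc["executed"] for tc in test_cases)            # executed so far
passed = sum(1 for tc in test_cases if tc["passed"] is True)
failed = sum(1 for tc in test_cases if tc["passed"] is False)

print(dict(per_requirement))    # {'R1': 2, 'R2': 2}
print(executed, passed, failed) # 3 2 1
```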
Reading