Controlling complexity is the core problem of software development. The huge variety of decisions a developer faces day to day cries out for methods of containing that complexity.
Complexity poses a primary obstacle for developers. Human neurology wires a complexity limit into each of us. The basics of Clean Code techniques can improve understanding and productivity by reducing what we must hold in working memory.
One of the most highly cited papers in psychology, “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information,” was published in 1956. The paper reports 7 +/- 2 as the number of objects the average human can hold in working memory. This corresponds to about three bits of information, since 2**3 = 8 is roughly the upper bound of average human performance. Continue reading
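The "about three bits" arithmetic can be made concrete: the information needed to distinguish N equally likely items is log2(N) bits, so 5 to 9 items span roughly 2.3 to 3.2 bits. A minimal sketch:

```python
import math

# Bits needed to distinguish N equally likely items: log2(N).
# Miller's 7 +/- 2 range (5..9 items) lands near 3 bits.
for n in (5, 7, 9):
    print(f"{n} items -> {math.log2(n):.2f} bits")
```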
“We are uncovering better ways of developing software by doing it and helping others do it.”
Duplicated code must surely rank near the top of the list of bad code. The problem is that a new feature or a bug fix will change one clone while the others are silently left alone, which will likely produce incorrect behavior. The tests for the untouched clones will likely not catch these bugs either. Continue reading
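The failure mode above can be sketched in a few lines. This is a hypothetical example (the function names and discount logic are invented for illustration): the same calculation was copy-pasted into two functions, and a bug fix, clamping the discount to a sane range, reached only one clone.

```python
def order_total(price, discount_pct):
    # Fixed clone: the discount is clamped to 0..100 percent.
    discount_pct = max(0, min(discount_pct, 100))
    return price * (1 - discount_pct / 100)

def invoice_total(price, discount_pct):
    # Forgotten clone: still accepts a discount over 100 percent,
    # silently producing a negative total.
    return price * (1 - discount_pct / 100)

print(order_total(100, 150))    # 0.0   (bug fixed here)
print(invoice_total(100, 150))  # -50.0 (bug lives on in the clone)
```

A test suite that only exercises `order_total` will pass, which is exactly why clones tend to rot unnoticed.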
How much does duplicated code really cost a project?
I’ve tried to find studies detailing the cost of duplicated code within projects. We can all recite the problems these clones produce, but can we quantify them? Apparently not.
One pundit suggests that if 20% of our code base were duplicated, then eliminating the duplication would reduce maintenance costs by 20%. Continue reading
A major theme of my recent career has been “Clean Code.” This movement started roughly a dozen years ago and has been building ever since.