February 26, 2014 | Code Quality

How to know when code is complex, Part 2: McCabe Complexity Metric

In my last post I discussed code complexity and the advantages of breaking complex code into smaller blocks. But how can we judge the size of these blocks? How can we quickly know the number of decisions in each block? If we think in terms of the delicate balance in a Calder mobile, how do we ensure the blocks are similar in size so we get the “balance” that we need?

Source: Cyclomatic Complexity

This is where the McCabe Complexity Metric (also known as cyclomatic complexity) comes in. Thomas McCabe introduced the metric in a December 1976 paper, “A Complexity Measure.” It counts the decision points (points where the logic path splits) in a section of code. So, for example, if we had four statements that moved data from one field to another, with no decisions among them, that section would get a score of 1. Add one decision and there are two paths, so the score is 2. If one of those branches contains another decision, there are three paths and the score is 3, and so on. Keep going and you can end up with some very large numbers. Generally it’s best to keep each section of code at 10 or less. Ten is about the number of paths you can hold in your head; any more, the theory goes, and you will get lost.
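The counting rule above can be sketched in a few lines of Python. This is an illustrative approximation, not an official McCabe tool: it scores a snippet as 1 plus the number of branching statements it finds, while real analyzers also count things like boolean operators and case branches.

```python
import ast

# Statement types treated as decision points in this simplified sketch.
DECISION_NODES = (ast.If, ast.IfExp, ast.For, ast.While, ast.ExceptHandler)

def mccabe_score(source: str) -> int:
    """Return 1 plus the number of branching statements in `source`."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

straight_line = "a = b\nc = d\ne = f\ng = h\n"        # four moves, no decisions
one_branch    = "if a:\n    b = 1\nelse:\n    b = 2\n"  # one decision, two paths

print(mccabe_score(straight_line))  # 1
print(mccabe_score(one_branch))     # 2
```

Note how the score matches the worked example in the text: straight-line code scores 1, and each added decision raises the score by one.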

The McCabe Metric will help you find hidden knots of logic in your programs. We’ve all experienced them: areas of code that seem needlessly complex. They tend to be error-prone, so developers spend most of their time in these areas analyzing and debugging. To quickly spot the areas of greatest complexity, you may want to establish a threshold of 10, or a slightly higher number such as 15, and critically examine anything that rises above it. In the programs I have reviewed, most of a program is usually under 10, but a few areas are higher, and sometimes much higher.
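As a sketch of how such a threshold might be applied (the function names and the simplified counting rule are my own, not from any particular tool), the following scans a module’s source and flags functions whose rough score exceeds the cutoff:

```python
import ast

THRESHOLD = 10  # the cutoff suggested above; 15 is a looser alternative

def flag_complex_functions(source: str, threshold: int = THRESHOLD) -> dict:
    """Return {function_name: score} for functions whose rough McCabe score
    (1 + branching statements, a simplification of the real metric)
    exceeds `threshold`."""
    decision_nodes = (ast.If, ast.IfExp, ast.For, ast.While, ast.ExceptHandler)
    flagged = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Note: a nested function's decisions also count toward its parent
            # here; that keeps the sketch short at the cost of some precision.
            score = 1 + sum(isinstance(n, decision_nodes) for n in ast.walk(node))
            if score > threshold:
                flagged[node.name] = score
    return flagged
```

Pointing this at a codebase surfaces exactly the “knots” described above: the handful of functions that sit well over the threshold while everything else stays under 10.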

When used along with the Halstead Metric, the McCabe Metric can help you objectively assess and compare the complexity of new programs and applications. By setting a threshold you can focus on the areas of greatest complexity, which can help you break them into smaller, more manageable logical sections.

I’ll discuss some practical uses of the McCabe Complexity Metric in my next post.


Mark Schettenhelm

Mark Schettenhelm, Product Manager for Compuware’s ISPW product, has more than 30 years of experience in the IT industry. Mark’s background is in Testing and Application Portfolio Analysis, with his current focus being Source Code Management. Mark has received the Certified Information Privacy Professional (CIPP/US) certification from the International Association of Privacy Professionals (IAPP). He has been interviewed in numerous publications including ZDNet UK, Application Development Trends and Storage Magazine, and has a regular column in Enterprise Tech Journal. In his 25 years with Compuware, Mark has been instrumental in helping to bring the Compuware mainframe solutions to the forefront of the industry. Mark’s efforts on matching customer needs and expectations with the functionality of various software and solutions have been extremely successful in helping the Company chart a path for the enterprise segment of the IT industry. Before joining Compuware, Mark was a software developer in the health care industry. Mark is a native of Michigan and graduated from Northern Michigan University with a B.S. in History.