I was reading this post about managed code analysis and code complexity over at the Vertigo Software Team System blog. At the bottom of the post they picked up on an issue which frustrates me as well: the hard-coding of the trigger points for code analysis warnings and errors against the cyclomatic complexity count.

Personally I want anything over ten to trigger a warning and anything over twenty to throw an error. My opinion is that with the frameworks we have today and the refactoring tools available to us there should be no excuse for writing a piece of code that is too hard to get a handle on, and that _starts_ happening once you get over a cyclomatic complexity count of ten.
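To make the thresholds concrete: cyclomatic complexity is essentially one plus the number of decision points in a method. A rough sketch of that counting (in Python for brevity, since FxCop rules themselves are .NET assemblies analysing IL, not source):

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate cyclomatic complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    count = 1
    for node in ast.walk(tree):
        if isinstance(node, ast.BoolOp):
            # 'a and b and c' contributes len(values) - 1 decision points.
            count += len(node.values) - 1
        elif isinstance(node, (ast.If, ast.For, ast.While,
                               ast.ExceptHandler, ast.IfExp)):
            count += 1
    return count

def check(source: str, warn_at: int = 10, error_at: int = 20) -> str:
    """Apply the thresholds I'm arguing for: warn over 10, error over 20."""
    cc = cyclomatic_complexity(source)
    if cc > error_at:
        return "error"
    if cc > warn_at:
        return "warning"
    return "ok"
```

A method with a single `if` scores two and passes cleanly; pile up eleven branches and you cross the warning line.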

The solution is for the next version of VSTS to make rules configurable (in the same way check-in policies are). Basically – if a rule implements a specific interface we should be able to right-click on it in the configuration screen and bring up a custom configuration dialog.

[Image: ConfigureFxCopRule]

In this case the UI could have a couple of trackbars which map to a fragment of XML which is stored in the MSBuild file and passed to the rule when it is invoked. The exclusion system could be tweaked a bit as well to allow complexity up to a certain point but not beyond that.
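The fragment itself wouldn't need to be much. A hypothetical sketch of what the trackbar settings might serialise to in the MSBuild file (the element and attribute names here are invented for illustration; only the rule ID, CA1502, is real):

```xml
<CodeAnalysisRuleSettings>
  <Rule Id="CA1502" Name="AvoidExcessiveComplexity">
    <Threshold Level="Warning" Value="10" />
    <Threshold Level="Error" Value="20" />
  </Rule>
</CodeAnalysisRuleSettings>
```

The rule would get handed its `<Rule>` element when invoked and read its trigger points from there instead of from constants baked into the assembly.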

The scenario that I am thinking of is when you do a code review after the above has triggered a warning and you decide that the code really can't be improved. Rather than just putting in a blanket exclusion you say that you will accept complexity up to its current level, but if it goes beyond that you want to be warned again. To make this possible an XML fragment would need to be passed into the constructor of the exclusion attribute.
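That ratchet-style exclusion logic could work along these lines (a sketch, not the FxCop API – the baseline is the accepted level carried in the XML fragment given to the exclusion attribute):

```python
def evaluate(current: int, baseline: int, warn_at: int = 10) -> str:
    """Warn only when complexity exceeds both the global threshold
    and the per-method baseline accepted at code-review time."""
    if current <= warn_at:
        return "ok"            # under the global threshold anyway
    if current <= baseline:
        return "accepted"      # reviewed and signed off at this level
    return "warning"           # crept past what the review accepted
```

So a method signed off at a complexity of fifteen stays quiet until somebody pushes it to sixteen – at which point the warning comes back.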

At any rate – the current defaults are WAY WAY WAY too high.