NDepend In Review


Sunday 17 Dec 2017 at 21:00
Development  |  c# staticanalysis software review

Back in September of this year, I was contacted by Patrick Smacchia, the lead developer from the NDepend team.  If you're not familiar with NDepend, it's a static analysis tool for .NET.  Patrick asked me if I would be interested in reviewing the software.  I don't normally write software reviews on my blog, but on this occasion, I was intrigued.  I'm a long-time user of ReSharper in my day-to-day work, and ReSharper offers a degree of static analysis of your code as you write it.  I'd also been watching certain open source projects that have appeared since the introduction of the Roslyn compiler technology, such as RefactoringEssentials, Code Cracker & SonarAnalyzer.  I've been thinking a lot about static analysis of code recently, especially since I discovered Connascence and started using it as a way to reason about code.

I had previously heard of the NDepend tool, but had never used it, so when Patrick contacted me and asked if I'd like to review his static analysis tool, I was more than happy to do so.

Full Disclosure: NDepend is a commercial tool with a trial period.  Patrick agreed to provide me with a complimentary professional license for the software if I wrote a review of the software on my blog.  I agreed on the condition that the review would be my own, honest opinion of the software, good or bad.  What follows is that personal opinion.

This is a fairly long blog post, so if you want the TL;DR, skip to the "Conclusion" section at the end, but I suggest that you read this whole post to better understand my complete experience of using the NDepend product.

Getting Started

The first step in getting up and running with NDepend was to download the software from the NDepend website.  This was easy enough, simply by following the relevant links from the NDepend homepage.  The version of NDepend that is current as I write this is v2017.3.2.  As I already had a license, I was able to enter the license key that had previously been supplied to me via email and begin the download.  NDepend comes as a .zip archive file rather than an MSI or other installer program.  This was somewhat unusual, as a lot of software is delivered via a setup installer package these days, but I'm an old-school, command line kind of guy and I liked that the program was supplied as a .zip file that you simply extract into a folder and go.  After having done that, there are a couple of ways you can launch NDepend.  There are a few executable files in the extracted folder, so it's not immediately obvious which one to run first, but NDepend does have a helpful getting started guide on its website that's easy enough to follow along with.

Image

For usage on a development machine, you're likely to be using either the standalone Visual NDepend IDE (VisualNDepend.exe) or, perhaps more likely, the Visual Studio plugin for interacting with NDepend (installed by running the NDepend.VisualStudioExtension.Installer.exe installer).  As well as being used on a developer's machine, NDepend can also run on your build machine, and here it is most likely to be integrated into your build process via the NDepend command line tool (NDepend.Console.exe).

Image

NDepend also ships with an executable called NDepend.PowerTools.exe, along with its source code in the NDepend.PowerTools.SourceCode folder.  This is a kind of optional extra utility which is run from the command line and contains a number of predefined metrics that can be run against some .NET code.  NDepend also provides an API with which we can integrate and use the NDepend functionality from our own code.  Many of the metrics of the NDepend PowerTools are also used within the "main" NDepend tool itself, but as the source code for the PowerTools utility is supplied, we can see exactly how those metrics are calculated and exactly how the various statistics about the code that NDepend is analysing are gathered from the NDepend API.  In essence, as well as providing some handy metrics of its own, NDepend PowerTools also serves as demonstration code for how to use the NDepend API.

Launching NDepend For The First Time

After installing the Visual Studio plugin and launching Visual Studio, I loaded in a solution of source code that I wanted to analyse.  I opted to initially launch NDepend via the installed Visual Studio plugin as I figured that's where I'd be interacting with it most.  NDepend installs itself as an extra menu option in the Visual Studio menu bar, much as other extensions such as ReSharper do.  When you first view this menu on a new solution, most of the menu options are greyed out.  Initially I was slightly confused by this, but I soon realised that just as you have a Visual Studio solution and one or more project files for your own code, so too does NDepend have its own project files, and you need to create one of these first before you can perform any analysis of the currently loaded solution.  NDepend helpfully allows this with one simple click on the "Attach New NDepend project to current VS Solution" menu item.  Once selected, you're asked which assemblies within your solution you wish to have analysed and then NDepend will go off and perform an initial analysis of your code.

NDepend uses both your solution's source code and the compiled assemblies to perform its analysis.  Most of the analysis is done against the compiled code, which allows NDepend to provide statistics and metrics on the IL (or Common Intermediate Language) that your code compiles to, as well as statistics and metrics on other areas of the code.  NDepend uses the solution's source code during analysis to gather metrics relating to things not present in the IL, such as code comments, and also to allow NDepend to locate code elements such as types, methods and fields when these are shown as part of a query or search (see later for further details).

Image

Once the analysis is complete, you'll get a pop-up window informing you of the completion of the initial analysis and asking you what to do next.  You'll probably also get a new tab opening in your default browser containing the NDepend Analysis Report.  It was at this point that I was initially quite confused.  Having multiple things pop up after the analysis was complete was a little startling and quite overwhelming.  One thing I'd rather have seen is for the browser-based report not to be loaded initially, but instead for there to be a button ("View Analysis Report" or something similar) within the pop-up window.  That way, only one "pop-up" appears, which to me is a lot less jarring.

The analysis popup allows you to view the NDepend dashboard, the interactive graph or the NDepend code rules.  I hadn't read the online documentation, so it's probably mostly my own fault; I'm the kind of person who just likes to dive straight into something and try to figure things out for myself.  At this point, though, I was quite stuck.  I didn't quite know what to do next.  The help text on the popup indicates that the Dashboard is the best place to start, so I selected that and was greeted with the following screen:

Image

Now, this is an incredibly informative dashboard, with an awful lot of information contained within it, but for me (and again, quite possibly my own fault for not reading the documentation more thoroughly) this was even more confusing.  A few things on the dashboard seem obvious and make sense, such as the number of lines of code, the number of lines of comments and the percentage compared to actual code, but a lot of the other information around the debt level, quality gates and rules didn't seem to make much sense to me at all.  I decided I'd look at a few of the other options that had been available in the popup; however, that initial popup had now disappeared, so I had to go hunting through the NDepend menus.  Under the "Graph" menu, I found the "View Dependency Graph" option, which seemed to correspond to one of the buttons I'd seen on the initial post-analysis pop-up.  Sure enough, opting to view the dependency graph showed me a nice UI of the various assemblies within my solution and how they were all related.

Image

The Dependency Graph is a very nice way to get an overview of your solution.  It should be familiar to developers who have used the Enterprise SKU of Visual Studio and its various Architecture features, specifically the Code Map.  What's great about NDepend's Dependency Graph is the ability to alter the box size of the assemblies based upon various metrics for the assembly.  By default, it's based upon the assembly's lines of code, but it can be changed to Cyclomatic Complexity, In/Out Edges, two types of coupling and overall "ranking", to name but a few.  The thickness of the arrows can also be altered based upon various metrics of the code, such as namespaces, types, methods etc.  This gives the Dependency Graph view a great amount of power in presenting an overview of your solution, and specifically in showing where complexity and potential issues with the code may lie.

An alternative view of your solution's dependency data can be found in the dependency matrix view.  This shows assemblies by name, duplicated across two axes, and shows the assembly metrics where the two assemblies intersect.

Image

The dependency matrix view is very handy on larger solutions that may contain dozens of different assemblies, as the Dependency Graph view of such a solution can be quite busy and difficult to follow due to being overloaded with information.  Another very useful visual representation of solution and project metrics is the Code Metrics View (aka the Treemap Metrics View).  This again shows all assemblies in the solution, but this time as a series of coloured boxes.  Each box is further sub-divided into more coloured boxes of various sizes based upon the chosen metric.

Image

The entire collection of boxes can be viewed at assembly, namespace, type, method or even field level, with the size of the main boxes being determined by a similar set of metrics to the Dependency Graph, albeit with more options to choose from for method-level metrics.  The colour of the boxes can be determined by a similar set of metrics to the size, and is perhaps most frequently used to show code coverage of the chosen element.  All of which gives a huge combination of options by which to gain a 100-foot view of the overall solution.  What's most interesting here is that the entire view can be based upon a specific "filter" or "query" performed over the entire codebase.

A "filter" or "query" over the codebase?  Well, this is where something called CQLinq comes in, and it's the very heart of the NDepend product.

Understanding NDepend

It took me a while to overcome the initial confusion and feeling of being overwhelmed by NDepend when I first started using it.  After some time, however, I started to understand how the product hangs together, and it all starts with the foundation upon which everything else that NDepend offers is built, be that dependency graphs, code metrics, quality rules or other features.  That foundation, the source of all of the impressive metrics and data that you can get from NDepend, is CQLinq.

CQLinq stands for Code Query Linq and is at the heart of NDepend's features and functionality; it is, essentially, a domain-specific language for code analysis.  It's based upon Linq, the Language Integrated Query functionality that's been part of the .NET Framework since version 3.5.  Any developer who has used Linq to query in-memory objects will know just how powerful and flexible a tool Linq can be.  It's even more powerful when Linq is used with a Linq provider that can target some collection of external data, such as a database.  Linq allows us to perform incredibly powerful filters, aggregations, projections and more, all from either a SQL-like syntax or a very intuitive fluent chaining syntax.  For example:

var totalOutstanding = Customers.Where(c => c.Type == 1).Sum(c => c.Balance);

shows a simple but powerful query allowing the developer to, in a single line of code, get the total sum of all balances for a given type of customer from an arbitrarily sized collection of customers, based upon properties of the customer objects within that collection.  This is just one example of a simple Linq query, and far more sophisticated queries are possible with Linq without adding much additional complexity to the code.  The data here, of course, is the collection of customers, but what if that data was your source code?   That's where CQLinq comes in.
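
(As a quick aside before we get to CQLinq: staying with the same hypothetical Customers collection, the fluent syntax scales nicely to grouping and aggregation too, as this small sketch shows.)

var balancesByType = Customers
    .GroupBy(c => c.Type)
    .Select(g => new { Type = g.Key, TotalBalance = g.Sum(c => c.Balance) })
    .OrderByDescending(x => x.TotalBalance);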

CQLinq takes your source code and turns it into data, allowing you, the developer, to construct a Linq query that can filter, aggregate and select metrics and statistics based upon viewing your code as data.  Here's a fairly simple CQLinq query:

from a in Application.Assemblies 
where a.NbLinesOfCode > 0 
orderby a.NbLinesOfCode descending 
select new { a, a.NbLinesOfCode }

Here we're saying: look at all the assemblies within the solution that have more than 0 lines of code, order those assemblies by their number of lines of code in descending order, then select a projected anonymous type containing the assembly itself and its count of lines of code.  The result is a list similar to the following:

MyApp.Admin         8554
MyApp.Core          6112
MyApp.Data          4232
MyApp.DataAccess    3095
MyApp.Services      2398

So, even with this simple query, we can see the expressive power of a simple CQLinq query and the aggregate data we can obtain from our source code.

Now, the real power of CQLinq lies in the custom objects and properties that it makes available for you to use.  In the query above, we can see the first line is selecting an a from Application.Assemblies.  Application is an object provided by NDepend, and Assemblies is a collection of objects associated with the Application.  Further into the query, we can see that the variable a, which represents an assembly, has a property called NbLinesOfCode.  This custom property is also provided by NDepend and tells us how many lines of actual code the assembly contains.

NDepend and CQLinq provide a large number of custom objects (NDepend calls them domains) and properties out of the box, giving insight into all sorts of different metrics and statistics on the source code that has been analysed by NDepend.  Custom objects/domains such as Assemblies, Types, Methods and Namespaces, and custom properties such as IsClass, DepthOfInheritance, CouldBePrivate, CyclomaticComplexity, NbLinesOfCode, MethodsCalled and MethodsCallingMe, to give only the very briefest of overviews, contain the real power of CQLinq and, ultimately, NDepend.
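
As a quick illustration of the kind of ad-hoc question these domains and properties let you ask, a query along the following lines would list classes with deep inheritance hierarchies (a sketch of my own rather than one of NDepend's built-in rules, and the threshold of 5 is entirely arbitrary):

from t in Application.Types
where t.IsClass && t.DepthOfInheritance >= 5
orderby t.DepthOfInheritance descending
select new { t, t.DepthOfInheritance }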

Once you come to appreciate that underneath the surface of all of the other features of NDepend - such as the large array of metrics shown on the NDepend Dashboard and the NDepend Analysis Report - is one or more CQLinq queries slicing and dicing the data of your source code, everything starts to become that bit clearer!  (Or at least, it did for me!)

Code Rules, Quality Gates & Metrics

Having understood that most of the power of NDepend stems from the data provided by CQLinq queries, we can revisit the various metrics on the Dashboard.  Here, we can see that amongst the most obvious metrics such as lines of code and the number of assemblies, types and namespaces etc. there are more interesting metrics regarding Quality Gates, Rules and Issues.

If we click on the numbers next to the Quality Gates that have either failed, warned or passed, NDepend opens up a Queries and Rules explorer window and a Queries and Rules Edit window.

Image

Within the Queries and Rules Explorer, we can see a treeview of a number of different rules, grouped together by category.  These are code quality rules that ship out of the box with NDepend, and include such categories as Code Smells, Architecture, Naming Conventions and many others.  The icons next to the various category groups show us whether our analysed source code has failed (i.e. violates a rule), is considered a warning, or passes for each rule defined.  If we click on a rule category within the treeview, we see the individual rules from that category in the window to the right.  This shows us how much of our code has matched the fail/warn/pass states.  Clicking on an individual rule name will cause the Queries and Rules Edit window to show the details for the chosen rule.  Note that, by default and for me at least, the Queries and Rules Explorer window was docked along the bottom of the main Visual Studio window, whilst the Queries and Rules Edit window was docked to the right of the main Visual Studio window (as a new tab alongside my Solution Explorer).  It was initially confusing, as I hadn't noticed one window being updated when interacting with the other, but once you're aware of this, it becomes much easier to understand.  One other quirk of the Queries and Rules Explorer window is that there doesn't appear to be any way to search for a specific rule within the many rules contained in the various category groups, and I found myself sometimes manually expanding each of the category groups in turn in order to find a rule I'd previously been examining.  It would be great if the ability to search within this window were introduced in a future version of NDepend.

The Queries and Rules Edit window is divided into a bottom section that shows, grouped in a treeview, the various parts of your source code (the assembly, type or method) that are matched by the chosen rule.  In the top section, we can see the rule name and a textual description of the rule itself.  For example, when looking at one of the Code Smells category rules, Avoid Types Too Big, we can see that the description states, "This rule matches types with more than 200 lines of code. Only lines of code in JustMyCode methods are taken account.  Types where NbLinesOfCode > 200 are extremely complex to develop and maintain.".

Hang on.  NbLinesOfCode?   We've seen that before.  Sure enough, clicking the "View Source Code" button at the top of the Queries and Rules Edit window changes the top section view into something else.  A CQLinq query!

Image

Note that if the Queries and Rules Edit window is docked to the right in its default position, the overall window can appear a little cramped.  For me, this made it a little unclear exactly what I was looking at until I undocked the window and expanded its size.

Each and every one of the more than 200 rules that ship out of the box with NDepend is based upon a CQLinq query.  What's more, each and every one of these queries can be edited, and you can create your own category groups and rules by defining your own CQLinq queries.  Think about that for a moment, as this is really what NDepend is providing you with: lots and lots of pre-defined query and examination power right out of the box, but more importantly a full-featured, fluent and highly intuitive Linq-To-Code (for want of a better expression) provider that you can make full use of to perform all manner of examinations on your own source code.
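
For instance, a home-grown rule might look something like the following (a hypothetical sketch rather than one of the shipped rules, with entirely arbitrary thresholds):

// My Rules: methods with long parameter lists that aren't trivial one-liners
warnif count > 0
from m in JustMyCode.Methods
where m.NbParameters > 4 && m.NbLinesOfCode > 20
select new { m, m.NbParameters, m.NbLinesOfCode }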

Here's the pre-defined CQLinq query for determining which methods are in need of refactoring:

warnif count > 0 from m in JustMyCode.Methods where 
  m.NbLinesOfCode > 30 ||           
  m.CyclomaticComplexity > 20 ||    
  m.ILCyclomaticComplexity > 50 ||  
  m.ILNestingDepth > 5 ||           
  m.NbParameters > 5 ||             
  m.NbVariables > 8 ||              
  m.NbOverloads > 6                 

select new { m, m.NbLinesOfCode, m.NbILInstructions, m.CyclomaticComplexity, 
             m.ILCyclomaticComplexity, m.ILNestingDepth, 
             m.NbParameters, m.NbVariables, m.NbOverloads }

As can be seen above, the query makes extensive use of the built-in computed properties, all of which are very intuitively named, to gather a list of methods throughout the solution that fall afoul of the query's rules and are therefore prime candidates for refactoring.  Specifically, it only considers methods from "JustMyCode", meaning that NDepend helpfully filters out all methods that belong to external frameworks and libraries.  And, of course, if you don't like that NDepend only considers methods with more than 30 lines of code to be in need of refactoring, you can always edit the query and change that value to any number that better fits you and your team.

When running NDepend from the Visual Studio plug-in, you get the ability to click on any of the results from the currently selected query/rule in the Queries and Rules Edit window and be taken directly to the type or method detected by the rule.   This is a really powerful and easy way to navigate through your code base to find all of the specific areas of code that may be in need of attention.  I did, however, notice that not all matched code elements can be easily navigated to (see below for more detail).

NDepend also has extensive tooltip windows that can pop up giving all manner of additional information about whatever you've selected and is currently in context.  For example, hovering over results in the Queries and Rules Edit window shows a large yellow tooltip containing even more information about the item you're hovering over:

Image

The "View Source Code" button within the Queries and Edit window acts as a toggle switch between viewing the source code of the CQLinq query and the textual description of the query.  The textual description often also includes text indicating "How To Fix" any of your code that violates the rule that the query relates to.  This can be a very handy starting point for identifying the first steps to take in refactoring and improving your code.

When editing CQLinq queries, the Queries and Rules Edit window provides full intellisense whilst editing or writing query syntax, as well as even more helpful tooltip windows that give further help and documentation on the various CQLinq objects and properties used within the query.

NDepend allows individual rules to be designated as "Critical" rules, which means that any code found to fall afoul of a critical rule will result in a "failure" rather than a "warning".  These are the rules (many are predefined for you, but you can always change them or define your own) which should never be violated, and such rules are often included as part of NDepend's Quality Gates.

Quality Gates are the way that NDepend determines an overall status of whether the code rules upon which the quality gate is based have passed or failed.  A Quality Gate is something that, like the code rules themselves, can be used as-is out of the box, be modified from an existing one, or be completely defined by yourself.  Quality gates can rely on one or more code rules in order to determine if the analysed source code is considered good enough to be released to production.  So, whereas the individual rules themselves will return a list of types, methods or other code artifacts that match the query, quality gates return a pass, warn or fail status that can be used as part of an automated build process in order to fail the build, similar to how compilation errors will fail compilation whereas warnings won't.  Very powerful stuff.  Quality gates are defined using the same CQLinq syntax we've seen before:

// <QualityGate Name="Percentage Code Coverage" Unit="%" />
failif value < 70%
warnif value < 80%
codeBase.PercentageCoverage

Note that the comment on the first line is what actually defines this query as a quality gate, while the failif and warnif lines that follow define the threshold values that trigger the failure and warning statuses respectively.
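
Gates don't have to be based on a single code-base-wide value, either; following the same pattern, a count-based gate over a query of code elements would look something like this (a hypothetical sketch of my own, with a made-up name and arbitrary thresholds):

// <QualityGate Name="Huge Types" Unit="types" />
failif count > 10
warnif count > 0
from t in JustMyCode.Types
where t.NbLinesOfCode > 500
select new { t, t.NbLinesOfCode }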

As well as code rules and quality gates, NDepend provides other metrics such as a SQALE Debt Ratio and other debt ratings.  The debt here is the technical debt that may exist within the code base.  Technical debt is the implied cost of the rework/refactoring required to fix parts of the solution that are considered to be of lower quality.  The SQALE ratio is a well-defined, largely industry-accepted mechanism for calculating the level of debt within a software project, and is expressed as the estimated technical debt as a percentage of the estimated effort it would take to rewrite the code element from scratch.  NDepend uses the SQALE ratio to show a "Debt Rating", expressed as a letter from A (the least debt) through to E (the most debt), which is attached to the overall solution as well as to individual elements of code such as types and methods.  This debt rating can be seen on many of the tooltips, such as the one shown when hovering over Queries and Rules Edit window results, as well as on the NDepend dashboard and Analysis Report.  There are many more metrics provided by NDepend, including a very interesting Abstractness vs Instability metric that's shown on the front page of the Analysis Report.  Speaking of which...

Back to the beginning

So, having taken a diversion to understand the core foundations of what makes NDepend tick, we can go back to the dashboard and analysis report that we saw at the beginning, just after we created a new NDepend project and ran an analysis on our code for the very first time.

Image

The dashboard and the report contain many of the same metrics, but whilst the dashboard is a window inside Visual Studio (or the separate Visual NDepend IDE), the report is an entirely standalone HTML report and so can be sent to, and viewed by, colleagues who may not have the requisite development tools on their machines.  This also means that the Analysis Report is the perfect artifact to be produced as part of your automated build process.  Many teams will often have a dashboard of some description that shows statistics and metrics on the build process and the quality of the code that's been built.  NDepend's analysis report can be easily integrated into such a dashboard, or even hosted on a website, as it's a static HTML file with some linked images.

The analysis report also contains static snapshot versions of the various graphical metrics representations that can be viewed inside NDepend such as the Dependency Graph, Dependency Matrix, Treemap Metric View and another interesting graph which plots Abstractness vs Instability.

This is an interesting graph and can give a good idea of which areas of the analysed solution are in need of some care and attention, albeit for quite different reasons.

Image

The graph plots each assembly within the solution against a diagonal baseline that runs from the top left corner of the graph to the bottom right.  One axis measures how "instable" an assembly is, and the other how "abstract" it is.  Abstractness is the proportion of the assembly's types that are abstract classes or interfaces rather than concrete classes; a more abstract assembly is somewhat easier to change and more malleable.  Instability indicates how readily the assembly can be changed without breaking the code that uses it, and is derived by analysing the various inter-dependencies within the code.  It is a combination of two other metrics, afferent coupling and efferent coupling.  Afferent coupling is the number of types that depend upon the given assembly, whilst efferent coupling is the number of external types that this assembly depends upon.  We want our assemblies to remain inside the diagonal green area of the graph.  If an assembly's code is highly abstract but very little else depends upon it, we're headed towards the "zone of uselessness".  This isn't a place we want to be, but correcting it may be reasonably easy, since code well within this area doesn't do an awful lot and so could probably be removed fairly easily.  If, however, an assembly's code is overly dependent on concrete types and is also heavily depended upon by the rest of the code base (i.e. it has a high amount of afferent coupling), we're headed towards the "zone of pain".  This is an area that we really don't want to be in, since code well within this area is probably doing a lot - maybe even too much - and it's going to be very difficult to change it without breaking things.
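
For those who like to see the arithmetic behind the chart, these are the classic package-design metrics, and a minimal sketch of how they're computed looks like the following (the helper names are mine, and the raw type and coupling counts would come from the analysis data):

using System;

static class MartinMetrics
{
    // A: the proportion of an assembly's types that are abstract classes or interfaces.
    public static double Abstractness(int abstractTypes, int totalTypes)
        => (double)abstractTypes / totalTypes;

    // I = Ce / (Ca + Ce): efferent coupling as a share of all coupling.
    public static double Instability(int afferentCoupling, int efferentCoupling)
        => (double)efferentCoupling / (afferentCoupling + efferentCoupling);

    // Distance from the "main sequence" diagonal: 0 = nicely balanced, towards 1 = deep in a zone.
    public static double DistanceFromMainSequence(double abstractness, double instability)
        => Math.Abs(abstractness + instability - 1);
}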

Keeping an eye on each assembly's place within the Abstractness vs Instability graph can help to ensure your code stays at an appropriate level of abstraction without becoming too abstract, and that it remains fairly well decoupled, allowing easier modification.  Metrics such as the Abstractness vs Instability graph and the treemap metric view are some of my favourite features of NDepend, as they present their data in a very easily consumable format that can be viewed and understood "at a glance" whilst being based on some very complex underlying data.

But wait, it gets better

Similar to the Queries and Rules Edit window that allows us to define our own code queries and rules as well as modify existing ones, NDepend includes a very powerful Search function.

Image

This function uses the same kind of interface as the Queries and Rules Edit window, but allows arbitrary queries to be performed against the analysed code, with the matching results showing immediately within the window.  NDepend's search functionality can optionally be constrained to methods, types, assemblies or namespaces, but by default will search across all code elements.  Searches can be based on simple text matching, regular expressions or even full CQLinq queries.  This makes NDepend's search functionality even more powerful than something like the "Go to anything/everything" functionality that's been a part of ReSharper for a long time and is now also a part of Visual Studio itself, as we can not only search for pieces of our code based on textual matches, but can also use the highly expressive power of NDepend's CQLinq syntax, objects and properties to search for code based upon such things as method size, code lacking unit tests, or code with no callers or too many callers.  As with the Queries and Rules Edit window results, we can double-click on any result and be navigated directly to the matched line of code.  One quirk I did notice when using the Search functionality is that, when searching for certain code elements such as fields or interfaces, you must click on the containing type or method that's shown as part of the search results and not the actual interface or field name itself.  Clicking on these will cause NDepend to display various errors explaining why it can't navigate directly to that element.  I suspect this is due to the fact that NDepend's analysis works on the compiled intermediate language and not directly against the source code, but since this is a code search function, it would be nice if such navigation were possible.
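
To give a flavour of the CQLinq-powered searches mentioned above, something along these lines surfaces methods that nothing else appears to call (a quick sketch of my own, in the spirit of the dead-code style rules that NDepend also ships):

from m in Application.Methods
where m.MethodsCallingMe.Count() == 0
select new { m, m.NbLinesOfCode }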

As well as analysing the source code of your solution, NDepend can leverage other tools' artifacts and data to improve its own analysis.  One major piece of data that NDepend can utilise is code coverage data.  There are a number of tools that can calculate the code coverage of your solution - that is, the amount of code within your solution that is "covered" by unit tests, meaning code that is executed as part of one or more unit tests.  Code coverage is built into Visual Studio itself at the highest SKU level, Enterprise, but is also provided by a number of other tools such as JetBrains' dotCover and NCover.  NDepend can import and work with the code coverage output data of all three of the aforementioned tools, although the files from the dotCover tool have to be provided to NDepend in a special format.  Once some code coverage data is added to an NDepend project, several additional metrics relating to code coverage become available within the complete suite of NDepend metrics, such as identifying code that should have a minimum level of coverage and code whose coverage percentage should never decrease over time, as well as other interesting metrics such as the amusingly titled C.R.A.P. metric.  This is the "Change Risk Analyzer and Predictor" metric and gives methods a score based upon their level of code coverage versus their complexity, with more complex methods requiring higher code coverage.  Having the ability to integrate code coverage into NDepend's metrics suite is another very nice touch and allows the full set of metrics for your solution to be centralised in a single location.  This also helps to keep automated builds fairly simple, without too many moving parts, whilst still providing an extensive amount of data on the build.
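
Once coverage data is in place, those figures become just more properties to query.  For example, something like the following would highlight complex methods with poor coverage (a sketch, assuming the coverage percentage is exposed per method in the same way it is for the code base as a whole; the thresholds are arbitrary):

from m in JustMyCode.Methods
where m.PercentageCoverage < 50 && m.CyclomaticComplexity > 10
orderby m.CyclomaticComplexity descending
select new { m, m.PercentageCoverage, m.CyclomaticComplexity }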

We've mentioned a few times how NDepend can be integrated into your automated build process.  There's nothing new in such tools being able to integrate with this fundamental part of any software team's daily processes, but whereas some tools simply give you the ability to run them from a command line and then leave you to figure out the exact return values from the executable, the other output artifacts and how they might be integrated, NDepend provides some very comprehensive online documentation (even with links to that documentation from the NDepend menus inside Visual Studio) giving very precise instructions for integrating NDepend into a large number of modern Continuous Integration tools, such as Visual Studio Team Services, TeamCity, Jenkins, FinalBuilder & SonarQube.  This documentation not only includes extensive step-by-step guides to the integration, but also tutorial and walkthrough videos.  This is a very nice touch.
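
At its simplest, the build-server side of this boils down to invoking the console runner against your NDepend project file, along the lines of the following (the paths are made up for illustration; the various switches for output folders, baselines and so on are covered in the documentation mentioned above):

"C:\Tools\NDepend\NDepend.Console.exe"  "C:\Build\MyApp\MyApp.ndproj"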

Finally, we've touched upon this above, but NDepend has the ability to perform its analysis in a temporal way.  This essentially means that NDepend can analyse the same solution at different times and then compare the different sets of analysis results.  This allows the production of "trends" within the analysis data and is incredibly powerful; it's possibly one of the most useful features of the NDepend product.  We saw earlier how the Analysis Report and the NDepend dashboard contain metrics for such things as the debt rating of the solution, but upon closer examination, we can see that the debt rating and percentage are shown as either an increase or a decrease since the last time that the solution's debt was calculated:

Image

Having such data available to us is incredibly powerful.  We can now track these metrics over time to see if our solution is generally improving or declining in quality - something that is very important for any software development team who must maintain a solution over a long period of time.   NDepend allows the setting of a "baseline" set of metrics data against which new sets of recently analysed data can be compared.  This gives us the powerful ability to compare our most recent set of analysis results not only with the immediately prior set of results, which, if NDepend's analysis is integrated into a continuous build process, could be very recent indeed, but also with those of a day ago, a week or a month ago, or even this time last year.  With this we can see not only how our code changes over small time frames, but over larger units of time too.  This ability to view metrics over time frames is helpfully built into many of the interfaces and windows within NDepend; for example, the Dashboard contains a very useful filter at the top of the window allowing us to set the time range for the charts shown within it.  NDepend includes the ability to generate a full set of metrics around trends within the solution, so we can, for example, track how many "issues" that an earlier NDepend analysis identified have since been fixed, as well as how many new issues have been introduced, and many of the built-in rules that NDepend defines are specifically based upon changes to the code over time.  For example, there is a complete category group, "Code Smells Regression", that contains many rules starting with the name "From now...".  These specific rules will be broken if code quality for the specific attribute measured falls over time, from one analysis run to the next.  This helps to ensure that code is not only improved in quality, but stays that way.  NDepend doesn't stop there with its ability to view changes over time: the CQLinq query language includes a large number of properties that specifically make use of the ability to examine such change in the code over time.  Objects/domains such as Issues and IssuesInBaseline, and properties such as NewerVersion on a method, allow us to write arbitrary queries comparing parts of our code over time.  So, for example, the following query shows us which methods have increased in cyclomatic complexity:

from m in Methods
where m.NewerVersion().CyclomaticComplexity > m.OlderVersion().CyclomaticComplexity
select m

This is exceptionally powerful data to have on a software project developed and maintained by a commercial team, as it allows very informed decisions to be made regarding how much effort within a given amount of work (say, an agile sprint) should be dedicated not to adding additional features and functionality but to maintenance of the existing code that's already there.  Moreover, not only do we know how much effort we should expend on code maintenance, but we also know exactly where that effort should be directed to have the maximum impact in improving code quality - a critical factor in the success of any software project.

Conclusion

So, we've looked at NDepend and examined some of its powerful static analysis features and we've seen how such a tool could be integrated into a broader software development life-cycle, but should you actually use NDepend for your static analysis needs?

NDepend is a commercial tool and its license comes in two flavours.  A Developer license is required for each developer that will use NDepend on their own development machine, and this license is currently priced at 399 euros.  If you want to use NDepend as part of your Continuous Integration process on a build machine, you'll need a separate license for that which is currently priced at 799 euros.  All licenses are for one year and can be renewed with a 40% discount on the initial purchase price.  All ongoing licenses are eligible for continued upgrades to the product.  All licenses are perpetual fallback licenses meaning that you can stop renewing and still keep using your current version, although you'll lose the ability to upgrade to newer versions of the product.

As a software tool, it's expensive but not exorbitantly so.  For any commercial software development team that cares about the quality of their code, and especially a team tasked with maintaining a software product over time, NDepend is not really expensive at all when compared to the cost of other tools that typical software development teams will invest in.  In such an environment, NDepend is well worth the money.  NDepend has the ability to fit in with most modern teams' processes and workflows and has the power to be configured and tweaked to any given team's requirements regarding what constitutes good vs bad code and the level of "quality" that the code has to adhere to before it's considered good enough for production release.  This gives the tool incredible power and flexibility.  That said, I think it's perhaps too expensive a tool for individual software developers who may wish to purchase it just for themselves.  Also, to the best of my knowledge, there's no free version of NDepend available for open source projects, as there is for other tools such as ReSharper, OzCode, TeamCity, AppVeyor and many more.  For this reason, I'd love to see another SKU or edition of the NDepend product that is more affordable for individual developers.  Perhaps an "NDepend Lite" that ships only as a Visual Studio plug-in, removes some functionality such as the ability to edit CQLinq queries and the tracking of metrics over time, and comes at a more affordable price for individual developers.

Setup of NDepend is very easy, the product being delivered as a simple .zip archive file.  The Visual Studio plugin is also very simple to install and use, and registering your software license is straightforward and uncomplicated, so you can be up and running with the NDepend product in a matter of minutes.  In a way, this is deceptive, as once you're back in Visual Studio (or even the Visual NDepend standalone IDE) it can be difficult to know exactly where to start.  Again, this is largely my own fault for trying to dive straight in without referring to the documentation.  And it must be said that NDepend's documentation is exceptionally good.  Although newer versions of NDepend try to improve the ease of getting into the software, the newly added "guide tool-tips" are not necessarily easy to discover until you stumble onto them, resulting in a bit of a chicken-and-egg situation.  It's hard to really criticise the product for this, though, as by its very nature it is a highly technical and complex product.  Having done things the hard way, my recommendation would be to spend some time reading the (very good) documentation first and only then dive into the product.

Although the initial learning curve can be quite steep, usage of the product is very easy, especially once you get familiar with the various functions, menu options and windows. And if you're stuck at any point, the excellent documentation and handy tool-tips are always only a mouse click (or hover) away.  Also, NDepend is quick.  Very quick.  Once analysis of a solution is completed - using my test solution, NDepend was able to perform its analysis in less than half the time it takes to fully compile and build the solution - the complete code-base can be queried and searched in near real-time.  And as well as providing a plethora of highly informative and profoundly useful metrics on the analysed code, NDepend also contains such functionality as a general-purpose "code-search-on-steroids" that gives other tools offering similar functionality (e.g. ReSharper) a run for their money.

NDepend isn't perfect, and I did discover a few quirks with the product as I used it.  One quirk seemed to be a bit of a false positive from a built-in rule that states that your own methods shouldn't be called "Dispose".  The section of code that was identified as breaking this rule did indeed have a Dispose() method, but it was necessary due to interface implementation.  In this case, it wasn't the IDisposable interface that the class was implementing, but rather the IHttpModule interface that forced me to implement the Dispose() method, yet NDepend didn't seem to take this into account and flagged it as a rule violation.  NDepend also has a rule about the names of methods not being too long; however, I found that my unit tests were frequently falling afoul of this rule as most of them had quite long method names.  This is quite normal for unit tests, where it's often considered good or best practice to give tests very expressive names.  You can always decide not to include the test assemblies in the NDepend analysis, but then you may miss out on other NDepend rules that would be very helpful, such as warning when a method has too many lines of code.  It would be nice to be able to tell NDepend to omit certain files or assemblies from certain rules or categories of rules, similar to how you can tell ReSharper to ignore certain inspections via comments within your code.  You can omit code from queries at the moment; however, in order to achieve this you have to manually edit the CQLinq of the query (see the sketch below), which isn't the best or simplest way.  On the plus side, NDepend have said that they will support a [SuppressWarning] attribute as well as custom views of code within queries, making ignoring certain code much easier in a future version of the product.
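
As an illustration of the kind of manual edit I mean, you end up taking a rule like the refactoring query shown earlier and bolting an exclusion onto its where clause (a hypothetical sketch on my part; the ParentAssembly navigation and the ".Tests" naming convention are my own assumptions, not something NDepend prescribes):

warnif count > 0
from m in JustMyCode.Methods
where m.NbLinesOfCode > 30
      // hypothetical hand-written exclusion for test assemblies:
      && !m.ParentAssembly.Name.EndsWith(".Tests")
select new { m, m.NbLinesOfCode }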

Another quirk I found was that navigating to certain code elements from either the search results or the results of code rules in the Queries and Rules Edit window can sometimes be problematic.  One rule states "Avoid interfaces too big", and so the results shown in the window are all the matched interfaces that the rule has deemed too big.  However, trying to navigate to an interface by double-clicking on the matching result gives an error with a reason of "N/A because interface", which is quite confusing and somewhat unintuitive.  It would be nice if navigation to code via double-clicking on query results were more universal, meaning that all results for all code elements would navigate to the corresponding section of source code when clicked.

Image

One omission from NDepend's out-of-the-box functionality that I'd love to see addressed is the ability to identify "code clones" - sections of code that are duplicated in various different parts of the solution.  Whilst this isn't a "metric" as such, it's certainly one of the first ports of call that I turn to when looking to refactor and improve an existing large code base.  In established code-bases, there is frequently a fair amount of duplicated code, and removing as much duplication as you possibly can not only reduces the number of lines of code (so long as readability and functionality are maintained, less code is always better than more code) but also helps in the refactoring of methods against other metrics such as cyclomatic complexity.  NDepend does include a "Search for Duplicate Code" powertool, although in my experience of comparing its output to that of the "Analyze solution for Code Clones" option in Visual Studio Enterprise, I found that it didn't detect as many instances of duplicated code.

Image

This is almost certainly due to the difference in how the two tools work - the NDepend feature looks for methods that make many calls to a "set" of other methods, whilst the Visual Studio function looks for duplicates of the actual code itself.  The NDepend feature no doubt works the way it does because it examines the compiled code rather than the raw source code.  However, in the test solution that I was using, which I knew contained many sections of identically duplicated code inline within many different methods, NDepend failed to identify a lot of the duplicated code (not to mention taking significantly longer to perform the analysis - a whopping 42 minutes for a 38 KLOC solution), whilst the Visual Studio feature detected them all in less than 5 minutes.  Also, as the NDepend functionality is only provided via the command-line PowerTools, it's not easy to immediately navigate to the offending lines of code by double-clicking, as we can with features that are part of the Visual Studio plug-in.  The good news, though, is that in speaking to Patrick, the NDepend lead developer, he tells me that a first-class, built-in function for detecting code duplicates is on the roadmap for a future version of the NDepend product.  Hopefully, this functionality shouldn't be too long in coming, as NDepend is a very actively developed product, receiving frequent updates.

One other feature I'd love to see in a future version of NDepend would be some dashboard metrics or query rules around identifying connascence within a software solution.  Admittedly, some of the levels of connascence are dynamic and so can't really be identified without running the actual code; however, the first five types of connascence can definitely be identified via static analysis, and it'd be great if NDepend could include identification of these out-of-the-box.

Overall, I would definitely recommend NDepend if code quality is important to you.  For any commercial software team charged with maintaining a code-base over a period of time, quality of that code-base is of critical importance to the overall success of the software and the ability and ease of continued development.  A tool like NDepend provides great power and insight into the quality of the code-base both right now as well as how that quality fluctuates over time.  NDepend's ability to both aggregate quality metrics into a single, simple debt percentage or rating as well as its ability to identify the many, individual specific issues that the code-base suffers from and how those issues are introduced or resolved over time is a huge help for any team in knowing not only exactly when to focus effort on quality improvement but where to focus that effort.