In the previous article we saw how software degrades over time and what software quality means. Let's now look at some best practices we can follow to keep the code in better shape and get early warnings before the code rots.
I have come across a couple of tools that can give us early warnings when we stray from the quality path. Start filling your toolkit with the tools below.
All developers think differently. Some follow coding standards, some don't. How do we make sure everyone writes code in a similar style, following the coding standards that have been set, so that code in different modules written by different developers looks the same and is quicker for others to understand? Here comes the first tool in your toolkit: ReSharper.
ReSharper is a productivity tool for .NET. It is a plugin that integrates with Visual Studio and is very handy during coding. Apart from its sensible default settings, it has options to customize the settings to match our company standards, which helps keep the code up to those standards. As soon as we break one of the rules we have set, the offending code is underlined with a warning, and we can ask ReSharper to fix it for us. All developers can use the same rule set, so the code looks consistent across all the different modules.
The ReSharper cheat sheet is available here.
While working on a file, ReSharper shows a summary of all the coding-guideline warnings and errors that are violated, with an icon in the top right corner of the editor. It is then the team's responsibility to fix all the warnings and errors ReSharper reports in the file before check-in. If the top right corner shows a green check icon, you are at the required standard.
As we saw in the previous post about the parameters that define software quality, the questions now come popping up:
- “How the heck can I know where the quality is poor in my project, with hundreds of DLLs and millions of lines of code?”
- “I am interested, but where do I start?”
- “I don't want to waste more time; is there a tool that will give me results quickly?”
- “How often should I run it? Can I automate it?”
There are a couple of tools that can help monitor quality parameters; some of them are FxCop, Sonar and NDepend. I have used NDepend extensively for .NET since it is very easy to use, fast, has excellent documentation on its site and a great support team, and has a sister product called JArchitect for Java. So once you have rules configured for your .NET products, the same rules can be reused for your Java products too, which is a wow factor.
When run on a project, this tool lists all the quality-parameter violations: cyclic dependencies, unused code, abstractness, cyclomatic complexity, too many variables and fields in methods and classes, even code that breaks the company coding standards, performance issues and much more. It also gives you the flexibility to write custom rules in a SQL-like format called CQL (Code Query Language) in older versions, and CQLinq in newer versions.
For example, a CQLinq rule to list all methods longer than 30 lines of code:

warnif count > 0
from m in Application.Methods
where m.NbLinesOfCode > 30
select m
NDepend can also compare two versions of the code base: for example, if you ran the tool in the first week and again in the second week, you can compare the two builds and see how the quality changed, which is a good metric for knowing whether we are improving.
It can also be integrated with continuous integration tools to run automatically during daily builds, so quality is monitored every single day. For critical rules we can even break the build and email the team about recently checked-in code that is not up to quality.
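As a rough illustration of such a quality gate, here is a minimal sketch in Python. The one-line-per-violation report format is made up for illustration; it is not NDepend's actual output, which a real setup would parse instead. The idea is simply that the CI job exits with a non-zero status, breaking the build, whenever a critical rule is violated:

```python
# Sketch of a CI quality gate. The "SEVERITY|rule|location" report format
# is invented for illustration only; a real job would parse the actual
# output of the quality tool it runs.
import sys

def critical_violations(report_lines):
    """Return (rule, location) pairs for every line marked CRITICAL."""
    found = []
    for line in report_lines:
        severity, rule, location = line.split("|", 2)
        if severity == "CRITICAL":
            found.append((rule, location))
    return found

def build_should_break(report_lines):
    """The build breaks if there is at least one critical violation."""
    return len(critical_violations(report_lines)) > 0

if __name__ == "__main__":
    report = [
        "WARNING|MethodTooBig|OrderService.Process",
        "CRITICAL|CyclicDependency|Billing -> Shipping -> Billing",
    ]
    if build_should_break(report):
        print("Build broken: critical quality violations found")
        sys.exit(1)
```

Warnings still show up in the report and the email, but only critical rules stop the build, so the team is nudged daily without being blocked by every minor issue.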
Those days were so good, when I used to love copying and pasting code. I was so productive; delivering similar features was a day's or a week's job for me, and my managers used to praise me a lot, LOL.
But now, when I see duplicate code my blood boils. What the heck happened to me? Is duplicate code good or bad?
Duplicate code is like a cancer: as it grows, it starts showing how bad its presence is and how hard it is to cure, and you need to invest huge amounts of money and time to cure it.
But why is duplicate code bad?
- A fix in one piece of duplicated code forces us to make the same change everywhere we copy-pasted it. If we miss one, God help the customers.
- The person who copy-pasted is the only one who knows where all the other jewels (clones) are. What if a new guy comes in to fix an issue in that section?
- It breaks core design principles such as DRY (Don't Repeat Yourself), SRP and OCP.
- It increases the LOC: more code to maintain and bigger assemblies.
Simian is a tool that detects duplicate code. It is really fast and has a couple of options to include and exclude files and folders, set a minimum duplicate line count based on our own standard, and produce output in different formats like XML, CSV etc. I prefer XML, with which I can write my own tool to read the output and show the duplicates in a variety of styles for better analysis.
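To get a feel for what a clone detector does conceptually, here is a minimal sketch in Python: it slides a fixed-size window over each file and records blocks of lines that appear in more than one place. Real tools like Simian are far more sophisticated (token-based matching, normalizing identifiers and whitespace), so treat this only as an illustration of the idea:

```python
# Naive illustration of clone detection: hash every block of N consecutive
# (whitespace-stripped) lines and report blocks that occur more than once.
# Real detectors like Simian work on tokens, not raw lines.
from collections import defaultdict

def find_duplicate_blocks(files, min_lines=3):
    """Find repeated blocks of `min_lines` consecutive lines.

    `files` maps filename -> list of source lines. Returns a dict mapping
    each duplicated block to the [(filename, start_line), ...] where it occurs.
    """
    seen = defaultdict(list)
    for name, lines in files.items():
        stripped = [ln.strip() for ln in lines]
        for i in range(len(stripped) - min_lines + 1):
            block = "\n".join(stripped[i:i + min_lines])
            seen[block].append((name, i + 1))  # 1-based start line
    return {block: locs for block, locs in seen.items() if len(locs) > 1}

if __name__ == "__main__":
    files = {
        "a.cs": ["int x = 1;", "int y = 2;", "int z = 3;", "int w = 4;"],
        "b.cs": ["// setup", "int x = 1;", "int y = 2;", "int z = 3;"],
    }
    for block, locs in find_duplicate_blocks(files).items():
        print(f"Duplicate at {locs}:\n{block}")
```

Running this on the two small files above reports the three-line block that was copy-pasted from a.cs into b.cs, along with where each copy starts, which is essentially the information Simian gives you at much larger scale.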
As Wikipedia says: continuous integration (CI) is the practice, in software engineering, of merging all developer working copies into a shared mainline several times a day.
Why do we have to do it so frequently?
In one of my old projects we used to generate a build only at the end of the iteration or cycle, which was once a month, and that day would be like a festival for us. No work, the whole team in one cubicle surrounding the person building the product (he would be totally freaked out), going home really, really late, all the managers staring at us as if we had committed some crime; postponing the delivery was a habit.
But what was the problem?
- Taking the latest copy from the repository, with a month's worth of changes from all the developers, and building it: guess what, hundreds of errors, and fights between developers over why interfaces were changed without notice.
- Many files were referenced locally but never checked into the repository.
- After fixing the build issues and generating the build, we would find missing functionality at the boundaries, with everyone assuming someone else would handle it.
- Rework, and nights at the office.
So generating builds frequently and running integration tests regularly ensures that on the final night we are at home, sleeping peacefully in bed.
There are a couple of tools to help us here; CruiseControl.NET and FinalBuilder are at the top of the list. These tools can be scheduled to take the latest copy of your code from the repository automatically, build the entire project/product, and send notifications by email on the failure or success of the builds.
Finally, what do we get from all this?
Using all these tools together, we can improve the overall quality of the software and keep monitoring it... Ask me how!
Use the continuous integration tools to build the assemblies, then run tools like NDepend and Simian to check the quality parameters and report any breakages to the team through email.
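To make the reporting step concrete, here is a minimal Python sketch that turns a list of violations into the notification email a CI job could send (via the standard library's smtplib, for instance). The violation tuples and addresses are made up for illustration; a real job would fill them from the actual NDepend/Simian output:

```python
# Sketch: turn quality-check results into a team notification email.
# The violations and addresses are placeholders; only the message is built
# here -- actually sending it would use smtplib against a real SMTP server.
from email.message import EmailMessage

def build_quality_report_email(violations, sender, recipients):
    """Build (but do not send) an email listing (rule, location) violations."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = f"Quality report: {len(violations)} violation(s) found"
    lines = [f"- [{rule}] {location}" for rule, location in violations]
    msg.set_content("Violations introduced in the latest build:\n" + "\n".join(lines))
    return msg

if __name__ == "__main__":
    msg = build_quality_report_email(
        [("MethodTooBig", "OrderService.Process"),
         ("DuplicateCode", "a.cs:1 / b.cs:2")],
        sender="ci@example.com",
        recipients=["team@example.com"],
    )
    print(msg["Subject"])
```

Hooked into the daily build, a message like this lands in the team's inbox the morning after a bad check-in, which is exactly the early warning we are after.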
With such a system in place we get early notifications and can take corrective action as and when issues are introduced. Imagine our lead coming to us and saying, “Yesterday you checked in a class in which a method named I_AM_BIG_METHOD is too big; please make it smaller or move part of it to another class to keep it SRP compliant”, the day after the check-in while it is still fresh in our minds, rather than making us sit and fix it on the final day of the release, or after a few months when we have totally forgotten the code we ourselves wrote.
As the saying goes, an apple a day keeps the doctor away; for programmers it is: using these tools every day keeps bad code away.
Clean and happy coding,