.NET Code Metrics


This project provides a Visual Studio add-in that calculates code complexity metrics for methods.

http://netcodemetrics.codeplex.com/

Related Projects

ReviewPal - The Code Review Companion for .Net.

  •    

ReviewPal, the Code Review Companion for .Net, is an Add-In / Extension for Visual Studio 2008 and Visual Studio 2010. The aim of the Add-In / Extension is to perform source code reviews within the Visual Studio IDE, where the code makes the most sense and is most readable.

Sonar - Code quality for all languages

  •    Java

Sonar is an open platform to manage code quality. As such, it covers the seven axes of code quality: design, code duplication, unit tests, code complexity, potential bugs, coding rules, and comments. It supports multiple languages, including .NET, PHP, and Java.

Panopticode - Gathers, correlates, and displays code metrics

  •    Java

The Panopticode project provides a set of open source tools for gathering, correlating, and displaying code metrics. It evaluates software quality and helps enforce coding standards. It installs and configures a set of tools such as Emma or Cobertura for unit test code coverage, Checkstyle to validate coding standards, JDepend to check dependencies, and JavaNCSS for cyclomatic complexity metrics.

JavaNCSS - A Source Measurement Suite for Java

  •    Java

JavaNCSS is a simple command line utility that measures two standard source code metrics for the Java programming language. Cyclomatic complexity metrics are collected globally, per class, or per function. It also integrates with Ant as a task. NCSS stands for Non Commenting Source Statements: source code excluding comments is used for the metrics calculation.
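
For illustration (not output from JavaNCSS itself), cyclomatic complexity is conventionally counted as the number of decision points in a method plus one, so the hypothetical Java method below scores 3:

    public class MetricsExample {
        // One 'for' condition and one 'if' give 2 decision points,
        // plus the base path of 1, for a cyclomatic complexity of 3.
        public static int countPositives(int[] values) {
            int count = 0;
            for (int v : values) {   // +1 decision point
                if (v > 0) {         // +1 decision point
                    count++;
                }
            }
            return count;
        }
    }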

FxCop Integrator

  •    

FxCop Integrator allows you to integrate stand-alone FxCop and the Code Metrics PowerTool into VS2010. You can analyze your source code in VS2010 Pro.


TeamReview - TFS Code Review

  •    

The most complete solution for Team System code reviews: a specific work item type and a Visual Studio add-in for a completely in-IDE code review experience. TeamReview exploits the advantages of Team System and VSX to reduce waste and surface new business value from code reviews.

Code Review Add-In

  •    

The Code Review Add-In assists with reviewing code and publishing the results as work items to the Team System server so that team members can review them.

DataStructureAndAlgorithms - Write code that runs faster and uses less memory, and prepare for your job interview

  •    Java

In this course you will learn how to analyze algorithms such as sorting, searching, and graph algorithms, and how to reduce code complexity from one Big-O level to another. Furthermore, you will learn about different types of data structures, how to find the Big-O of each data structure, and how to apply the correct data structure to your problem in Java. By the end you will be able to write code that runs faster and uses less memory. You will also learn how to analyze problems using dynamic programming.
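
As a rough sketch of what reducing code complexity from one Big-O level to another means (the class and method names here are illustrative, not taken from the course), replacing a nested linear scan with a HashSet lookup turns an O(n*m) membership check into roughly O(n + m):

    import java.util.HashSet;
    import java.util.Set;

    public class BigOExample {
        // O(n * m): for every query, scan the whole data array.
        static int countMatchesSlow(int[] data, int[] queries) {
            int matches = 0;
            for (int q : queries) {
                for (int d : data) {
                    if (d == q) { matches++; break; }
                }
            }
            return matches;
        }

        // O(n + m): build a HashSet once, then each lookup is O(1) on average.
        static int countMatchesFast(int[] data, int[] queries) {
            Set<Integer> seen = new HashSet<>();
            for (int d : data) seen.add(d);
            int matches = 0;
            for (int q : queries) {
                if (seen.contains(q)) matches++;
            }
            return matches;
        }
    }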

reviewdog - :dog: Automated code review tool integrated with any code analysis tools regardless of programming language

  •    Go

"reviewdog" provides a way to post review comments to code hosting service, such as GitHub, automatically by integrating with any linter tools with ease. It uses an output of lint tools and posts them as a comment if findings are in diff of patches to review. reviewdog also supports run in the local environment to filter an output of lint tools by diff.

word2vec-sentiments - Tutorial for Sentiment Analysis using Doc2Vec in gensim (or "getting 87% accuracy in sentiment analysis in under 100 lines of code")

  •    Jupyter

However, the Word2Vec documentation is shit. The C code is nigh unreadable (700 lines of highly optimized, and sometimes weirdly optimized, code). I personally spent a lot of time untangling Doc2Vec and crashing into ~50% accuracies due to implementation mistakes. This tutorial aims to help other users get off the ground using Word2Vec for their own research. We use Word2Vec for sentiment analysis by attempting to classify the Cornell IMDB movie review corpus (http://www.cs.cornell.edu/people/pabo/movie-review-data/). The specific data set used is available for download at http://ai.stanford.edu/~amaas/data/sentiment/. The code to just run Doc2Vec and save the model as imdb.d2v can be found in run.py. It should also be useful for running on computer clusters.

C and C++ Code Counter

  •    Java

CCCC is a tool that analyzes C++ and Java files and generates a report on various metrics of the code. Metrics supported include lines of code, McCabe's complexity, and the metrics proposed by Chidamber & Kemerer and Henry & Kafura.

Review Board - Code Review Tool

  •    Python

Review Board is a powerful web-based code review tool that offers developers an easy way to handle code reviews. It scales well from small projects to large companies and offers a variety of tools to take much of the stress and time out of the code review process. Review Board supports reviewing code for Bazaar, CVS, Git, Mercurial, Perforce, and Subversion repositories.

scc - Sloc, Cloc and Code: scc is a very fast accurate code counter with complexity calculations and COCOMO estimates written in pure Go

  •    Go

A tool similar to cloc, sloccount, and tokei, for counting physical lines of code, blank lines, comment lines, and lines of source code in many programming languages. The goal is to be the fastest code counter possible, while also performing COCOMO calculations like sloccount and estimating code complexity similarly to cyclomatic complexity calculators. In short, one tool to rule them all, and the one I wish I had before I wrote it.
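
To give a sense of the kind of COCOMO estimate such counters produce, the sketch below applies the classic basic-COCOMO "organic" coefficients to a line count; scc's actual model and coefficients may differ:

    public class CocomoEstimate {
        public static void main(String[] args) {
            double kloc = 12.0; // thousands of physical source lines counted

            // Basic COCOMO, "organic" project class (textbook coefficients).
            double effort = 2.4 * Math.pow(kloc, 1.05);     // person-months
            double schedule = 2.5 * Math.pow(effort, 0.38); // months
            double people = effort / schedule;              // average staffing

            System.out.printf("Effort: %.1f person-months%n", effort);
            System.out.printf("Schedule: %.1f months%n", schedule);
            System.out.printf("Average staff: %.1f people%n", people);
        }
    }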

code-review-tips - :microscope: Common problems to look for in a code review

  •    Javascript

Code reviews can inspire dread in both reviewer and reviewee. Having your code analyzed can feel as invasive as being screened by the TSA as you go off to your vacation. Even worse, reviewing other people's code can feel like a painful and ambiguous exercise, searching for problems and not even knowing where to begin. This project aims to provide some solid tips for how to review the code that you and your team write. All examples are written in JavaScript, but the advice should be applicable to any project of any language. This is by no means an exhaustive list, but hopefully this will help you catch as many bugs as possible long before users ever see your feature.

puma-scan - Puma Scan is a software security Visual Studio extension that provides real time, continuous source code analysis as development teams write code

  •    CSharp

Puma Scan is a .NET software secure code analysis tool providing real time, continuous source code analysis as development teams write code. In Visual Studio, vulnerabilities are immediately displayed in the development environment as spell check and compiler warnings, preventing security bugs from entering your applications. Puma Scan also integrates into the build to provide security analysis at compile time. The Puma Scan Community Edition is licensed under the Mozilla Public License (MPL) version 2.0.

NodeJsScan - NodeJsScan is a static security code scanner for Node.js applications.

  •    Python

Static security code scanner (SAST) for Node.js applications. The command line interface (CLI) allows you to integrate NodeJsScan into DevSecOps CI/CD pipelines. The results are in JSON format. When you use the CLI, the results are never stored with the NodeJsScan backend.

reviewboard - An extensible and friendly code review tool for projects and companies of all sizes.

  •    Python

Review Board is an open source, web-based code and document review tool built to help companies, open source projects, and other organizations keep their quality high and their bug count low. We began writing Review Board in 2006 to fill a hole in the code review market. We wanted something open source that could be flexible enough to work with a variety of workflows, and could take the pain out of the code review process.

Crap4Net

  •    

C.R.A.P. is a code metric aimed at indicating the confidence level a developer should have in their code. It weighs code complexity against test coverage. This project helps .NET developers measure the C.R.A.P. level of their code.
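
The commonly cited formula for the metric is CRAP(m) = comp(m)^2 * (1 - cov(m))^3 + comp(m), where comp(m) is a method's cyclomatic complexity and cov(m) its test coverage as a fraction; the sketch below illustrates that calculation and is not necessarily Crap4Net's exact implementation:

    public class CrapMetric {
        // Commonly cited C.R.A.P. formula: comp^2 * (1 - coverage)^3 + comp,
        // with coverage expressed as a fraction between 0 and 1.
        static double crap(int cyclomaticComplexity, double coverage) {
            return Math.pow(cyclomaticComplexity, 2) * Math.pow(1.0 - coverage, 3)
                    + cyclomaticComplexity;
        }

        public static void main(String[] args) {
            // A complex, untested method scores far worse than a well-tested one.
            System.out.println(crap(10, 0.0)); // 110.0 -> high risk
            System.out.println(crap(10, 0.9)); // ~10.1 -> acceptable
        }
    }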