A Ruby library which implements the ID3 (information gain) algorithm for decision tree learning. Currently, both continuous and discrete datasets can be learned.
rubyml decision-tree machine-learning

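As a rough illustration of what the ID3 criterion computes (the information-gain calculation itself, not this gem's Ruby API), here is a minimal Python sketch; the dataset, attribute names, and labels are made up for the example.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Entropy reduction from splitting `rows` on a discrete `attribute`."""
    parent = entropy(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute], []).append(label)
    weighted = sum(len(part) / len(labels) * entropy(part)
                   for part in partitions.values())
    return parent - weighted

# Hypothetical toy data: ID3 picks the attribute with the highest gain.
rows = [{"outlook": "sunny", "windy": False},
        {"outlook": "rain",  "windy": True},
        {"outlook": "sunny", "windy": True},
        {"outlook": "rain",  "windy": False}]
labels = ["no", "yes", "no", "yes"]
best = max(("outlook", "windy"), key=lambda a: information_gain(rows, labels, a))
print(best)  # -> "outlook"
```
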
GoJS is a JavaScript and TypeScript library for creating and manipulating diagrams, charts, and graphs. GoJS is a flexible library that can be used to create a number of different kinds of interactive diagrams, including data visualizations, drawing tools, and graph editors. There are samples for flowcharts, org charts, business process (BPMN) diagrams, swimlanes, timelines, state charts, kanban boards, networks, mindmaps, sankey diagrams, family trees and genogram charts, fishbone diagrams, floor plans, UML, decision trees, PERT charts, Gantt charts, and hundreds more. GoJS includes a number of built-in layouts, including tree layout, force-directed, radial, and layered digraph layouts, as well as a number of custom layout examples.
html diagram flowchart orgchart layout graph typescript chart tree uml bpmn editor drawing canvas svg hierarchy family-tree decision-tree mindmap treeview genogram ivr-tree parse-tree concept-map visualization entity-relationship er-diagram radial sankey pert gantt timeline swimlane workflow flow process state sequential-function grafcet sequence circuit dataflow data-flow planogram floorplan seating pipes tree-layout force-directed layered hierarchical circular fishbone ishikawa treemap tree-map subgraph network library js html5

Ruby scoring API for Predictive Model Markup Language (PMML). Currently supports Decision Tree, Random Forest, Naive Bayes, and Gradient Boosted Models.
ruby-gem pmml random-forest classification rubyml machine-learning gradient-boosting-classifier gbm gradient-boosted-models decision-tree naive-bayes

Small JavaScript implementation of algorithms for training Decision Tree and Random Forest classifiers.
decision-tree random-forest machine-learning

#Update I've made some updates to the data loading logic, so it now reads CSV-format files. The previous version is still accessible, but it is no longer supported. #Introduction JavaScript implementation of several machine learning algorithms, including Decision Tree and Logistic Regression so far. More to come.
logistic-regression decision-tree machine-learning classifier c4.5

TiMBL is an open source software package implementing several memory-based learning algorithms, among which IB1-IG, an implementation of k-nearest neighbor classification with feature weighting suitable for symbolic feature spaces, and IGTree, a decision-tree approximation of IB1-IG. All implemented algorithms have in common that they store some representation of the training set explicitly in memory. During testing, new cases are classified by extrapolation from the most similar stored cases. For over fifteen years TiMBL has been mostly used in natural language processing as a machine learning classifier component, but its use extends to virtually any supervised machine learning domain. Due to its particular decision-tree-based implementation, TiMBL is in many cases far more efficient in classification than a standard k-nearest neighbor algorithm would be.
machine-learning classification learning-algorithm timbl ib1 igtree decision-tree k-nearest-neighbours nearest-neighbours knn c-plus-plus

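To illustrate the memory-based principle described above (store the training cases, classify new cases by extrapolating from the most similar stored cases), here is a minimal Python sketch of weighted-overlap nearest-neighbour classification over symbolic features. It mirrors only the general IB1-style idea; it is not TiMBL's file format or command-line interface, and the feature weights below are placeholders rather than true information-gain weights.

```python
from collections import Counter

class MemoryBasedClassifier:
    """Toy IB1-style classifier: keep every training case in memory and
    classify new cases from the k most similar stored cases."""

    def __init__(self, weights, k=3):
        self.weights = weights      # per-feature weights (e.g. information gain)
        self.k = k
        self.memory = []            # stored (features, label) pairs

    def fit(self, instances, labels):
        self.memory = list(zip(instances, labels))

    def _similarity(self, a, b):
        # Weighted overlap metric over symbolic feature values.
        return sum(w for w, x, y in zip(self.weights, a, b) if x == y)

    def predict(self, instance):
        nearest = sorted(self.memory,
                         key=lambda m: self._similarity(instance, m[0]),
                         reverse=True)[:self.k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical usage with made-up symbolic features.
clf = MemoryBasedClassifier(weights=[0.7, 0.2, 0.1], k=1)
clf.fit([("det", "noun", "verb"), ("noun", "verb", "det")], ["NP", "VP"])
print(clf.predict(("det", "noun", "adj")))   # -> "NP"
```
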
Monotonicity constraints can turn opaque, complex models into transparent and potentially regulator-approved models by ensuring predictions only increase or only decrease for any change in a given input variable. In this notebook, I will demonstrate how to use monotonicity constraints in the popular open source gradient boosting package XGBoost to train a simple, accurate, nonlinear classifier on the UCI credit card default data. Once we have trained a monotonic XGBoost model, we will use partial dependence plots and individual conditional expectation (ICE) plots to investigate the internal mechanisms of the model and to verify its monotonic behavior. Partial dependence plots show us the way machine-learned response functions change based on the values of one or two input variables of interest, while averaging out the effects of all other input variables. ICE plots can be used to create more localized descriptions of model predictions, and ICE plots pair nicely with partial dependence plots. An example of generating regulator-mandated reason codes from high-fidelity Shapley explanations for any model prediction is also presented. The combination of monotonic XGBoost, partial dependence, ICE, and Shapley explanations is likely the most direct way to create an interpretable machine learning model today.
machine-learning fatml xai gradient-boosting-machine decision-tree data-science fairness interpretable-machine-learning interpretability machine-learning-interpretability iml accountability transparency data-mining interpretable-ml interpretable interpretable-ai lime h2o

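A condensed sketch of this workflow might look like the following. It uses synthetic data rather than the UCI credit card default set, and the constraint signs and hyper-parameters are illustrative assumptions, not the notebook's actual settings.

```python
import numpy as np
import matplotlib.pyplot as plt
from xgboost import XGBClassifier
from sklearn.inspection import PartialDependenceDisplay

# Synthetic stand-in for the credit card data: two numeric features.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
# Assume risk rises with feature 0 and falls with feature 1 (made up).
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# monotone_constraints: +1 forces non-decreasing, -1 non-increasing predictions.
model = XGBClassifier(
    n_estimators=100,
    max_depth=3,
    monotone_constraints="(1,-1)",
)
model.fit(X, y)

# Partial dependence plus ICE curves for the first feature.
PartialDependenceDisplay.from_estimator(model, X, features=[0], kind="both")
plt.show()
```

If the constraint is working, the partial dependence and ICE curves for the first feature should be non-decreasing across its range.
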
Machine learning and neural networks are fast becoming pillars on which you can build intelligent applications. The course will begin by introducing you to Python and discussing AI search algorithms. You will learn math-heavy topics, such as regression and classification, illustrated by Python examples.
python3 jupyter-notebook decision-tree regression neural-network machine-learning-fundamentals

Rule-based classification with explicit support for ambiguity resolution.
classification classifier decision-table decision-tree

DDT allows building custom decision trees based on a set of defined rules, programmatically or from JSON. When making a decision, it allows adding a pre-processing stage to the input before comparing it with the possible branches of the tree.
decision machine-learning tree decision-tree decison-trees user-tree

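The general idea is roughly as follows. This is a generic Python sketch of a rule tree loaded from JSON with a pre-processing hook, not DDT's actual API; all names in it are invented for illustration.

```python
import json

# Hypothetical rule tree: a node names an attribute, an optional pre-processing
# step, and a list of branches whose tests are compared against the input.
TREE_JSON = """
{
  "attribute": "age",
  "preprocess": "to_int",
  "branches": [
    {"test": {"op": "<",  "value": 18}, "decision": "reject"},
    {"test": {"op": ">=", "value": 18}, "decision": "accept"}
  ]
}
"""

PREPROCESSORS = {"to_int": int}   # pre-processing applied before comparison
OPS = {"<": lambda a, b: a < b, ">=": lambda a, b: a >= b}

def decide(tree, data):
    value = data[tree["attribute"]]
    if "preprocess" in tree:
        value = PREPROCESSORS[tree["preprocess"]](value)
    for branch in tree["branches"]:
        test = branch["test"]
        if OPS[test["op"]](value, test["value"]):
            # A branch may hold a final decision or a nested subtree.
            return branch["decision"] if "decision" in branch else decide(branch["tree"], data)
    raise ValueError("no branch matched")

print(decide(json.loads(TREE_JSON), {"age": "25"}))   # -> "accept"
```
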
NodeJS implementation of a Decision Tree using the ID3 algorithm.
decision tree classifier classification machine learning decision-tree machine-learning id3

A Random Forest library in Python, compatible with scikit-learn.
data-science machine-learning random-forest scikit-learn machine-learning-algorithms regression pandas classification ensemble-learning decision-tree

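Since the library advertises scikit-learn compatibility, its usage presumably mirrors the standard estimator interface. The sketch below shows that interface using scikit-learn's own RandomForestClassifier on a toy dataset; it is not this package's import path, which is not verified here.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A scikit-learn-compatible forest exposes the same fit/predict contract.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
print(accuracy_score(y_test, forest.predict(X_test)))
```
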
A tool to build data classification rules using a visual, flowchart-style decision tree. Uses d3.js v4 for SVG drawing. You can optionally color the stroke of nodes and links based on their truthy/falsy status; see demo.css for an example.
d3 d3v4 flowchart decision-trees decision-tree-classifier decision-diagrams d3-visualization decision-tree