JBoost includes state-of-the-art algorithms and can be used by researchers to quickly implement new boosting algorithms. Because JBoost also ships with a set of easy-to-use scripts, machine learning novices can quickly learn and harness the power of boosting.
Here are some key features of "JBoost":
· Multiple boosting algorithms. JBoost can utilize the power of AdaBoost, exploit the noise-resistant properties of BrownBoost, or take advantage of the asymmetric-cost handling of NormalBoost (soon to come!).
· Visualization. Earlier boosting methods focused on combining many decision trees, so the final combined classifier could contain millions of nodes. ADTrees have the same expressive power while containing orders of magnitude fewer nodes. This leads to robust classifiers that are easily understandable and human-editable.
· Multi-Label. Many problems demand that each example can be classified in multiple ways. For instance, a movie may be both "romantic" and a "comedy".
· Multi-Class. Many classification algorithms (including AdaBoost) were originally designed to distinguish between only two classes (for instance, is the car "broken" or "not broken"). JBoost can classify an arbitrary number of classes.
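To give a flavor of what a boosting algorithm like AdaBoost does under the hood, here is a minimal, self-contained sketch in Java: decision stumps on one-dimensional data are combined into a weighted vote. All class and variable names here are illustrative only and are not part of JBoost's API; JBoost's own implementation (ADTrees, BrownBoost, multi-class support) is far more general.

```java
import java.util.Arrays;

// Minimal sketch of discrete AdaBoost with decision-stump weak learners.
public class AdaBoostSketch {

    // A stump predicts +1 if x > threshold, else -1 (sign flipped by polarity).
    static final class Stump {
        double threshold; int polarity; double alpha;
        Stump(double t, int p) { threshold = t; polarity = p; }
        int predict(double x) { return (x > threshold ? 1 : -1) * polarity; }
    }

    public static void main(String[] args) {
        double[] xs = {1, 2, 3, 4, 5, 6};
        int[] ys    = {-1, -1, -1, 1, 1, 1};   // labels in {-1, +1}
        int n = xs.length, rounds = 5;

        double[] w = new double[n];            // per-example weights
        Arrays.fill(w, 1.0 / n);
        Stump[] ensemble = new Stump[rounds];

        for (int t = 0; t < rounds; t++) {
            // Pick the stump with the lowest weighted training error.
            Stump best = null;
            double bestErr = Double.MAX_VALUE;
            for (double thr = 0.5; thr < 6.5; thr += 1.0) {
                for (int pol : new int[]{1, -1}) {
                    Stump s = new Stump(thr, pol);
                    double err = 0;
                    for (int i = 0; i < n; i++)
                        if (s.predict(xs[i]) != ys[i]) err += w[i];
                    if (err < bestErr) { bestErr = err; best = s; }
                }
            }
            // Stump weight: alpha = 0.5 * ln((1 - err) / err).
            bestErr = Math.max(bestErr, 1e-10);    // avoid log of zero
            best.alpha = 0.5 * Math.log((1 - bestErr) / bestErr);
            ensemble[t] = best;

            // Re-weight examples: misclassified ones gain weight.
            double sum = 0;
            for (int i = 0; i < n; i++) {
                w[i] *= Math.exp(-best.alpha * ys[i] * best.predict(xs[i]));
                sum += w[i];
            }
            for (int i = 0; i < n; i++) w[i] /= sum; // renormalize
        }

        // Final classifier: sign of the alpha-weighted vote.
        int correct = 0;
        for (int i = 0; i < n; i++) {
            double score = 0;
            for (Stump s : ensemble) score += s.alpha * s.predict(xs[i]);
            if ((score >= 0 ? 1 : -1) == ys[i]) correct++;
        }
        System.out.println("training accuracy: " + correct + "/" + n);
    }
}
```

The re-weighting step is the heart of boosting: each round concentrates the next weak learner on the examples the ensemble currently gets wrong. Variants like BrownBoost change this weighting scheme to tolerate label noise.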
What's New in This Release:
· Fix possible bugs with ADD_ROOT
· Remove support for MultiLabel and MultiClass from the Java code
· Add more test cases
· Add BrownBoost