Neuroph Changelog

What's new in Neuroph 2.85

Apr 17, 2014
  • Added new GUI features and handy API extensions.

New in Neuroph 2.4 (May 25, 2010)

  • Perceptron and MultiLayerPerceptron now use a linear transfer function in the input layer (this appears to improve learning, and is the correct approach)
  • Changed the way ThresholdNeuron calculates output: it used to compare the total input with the threshold, but now it computes the difference totalInput - thresh. Since the output uses a Step transfer function this makes no difference to the final result, but it is a better model for visualization
  • The training monitor is now displayed as an internal frame so it does not hide behind the main frame
  • New icons for toolbar buttons
  • Created start.bat for easyNeurons
  • Default max error setting of 0.01 for all supervised learning rules (many users forget to set it when training from code)
  • Added load(InputStream inputStream) method to the NeuralNetwork class to enable the use of getResourceAsStream to load a neural network from a jar (see the sketch after this list)
  • Added BiasNeuron class, which provides the bias feature for MLPs and other networks. A bias neuron always outputs a high activation level and has no inputs
  • Added bias neuron in MultiLayerPerceptrons
  • Option to create direct connections from the input to the output layer
  • Users can choose which learning rule to use for an MLP from the GUI: basic backpropagation, backpropagation with momentum, or the new dynamic backpropagation, which provides new learning features
  • Total network error formula fixed (again): total_error = (1/(2n)) * sum(e^2). We now multiply by 1/(2n), whereas before it was just 1/n; the original formula uses 1/(2n)
  • Pause learning feature: learning threads can be paused from the GUI and from code
  • Created PerceptronLearning rule, an LMS-based learning rule for perceptrons (but not the same as BinaryDeltaRule)
  • Added hasReachedStopCondition() to the SupervisedLearning class so derived classes can override it to create custom stopping conditions if needed
  • Added new stopping condition 'min error change' to SupervisedLearning, to stop learning if the error change stays too small for some number of iterations (when learning gets stuck in a local minimum)
  • Added doOneLearningIteration method to IterativeLearning, which allows step-by-step learning
  • Added DynamicBackpropagation, which can use a dynamic learning rate and momentum. It is possible to specify min and max values and a change rate. If the total network error is decreasing, both parameters are increased to speed up error lowering; when the error is increasing, both values are decreased to limit the error growth
  • Improved thread synchronization for the error graph: training is faster and drawing is smoother
  • Added initializeWeights() methods to the NeuralNetwork class to provide a way to initialize the network with the same weights every time
  • Neurons and their components are now created using reflection in NeuronFactory, which provides a powerful mechanism for creating/adding custom neurons and transfer functions
  • Added Properties and modified the NeuronProperties class (util package), so it now accepts a neuron specification in the form of a (key, value) collection where the values are Class instances
  • Perceptron and Backpropagation samples in easyNeurons, which provide learning visualization
  • Neuron properties are now displayed in an internal frame so the dialog does not hide behind the main frame when the user clicks somewhere else. It also shows the neuron's class
  • Added a mechanism for defining size constraints and validation for input and output vectors in a training set: TrainingSet constructors can accept sizes for the input and/or output vector, and each training element is checked before it is added to the training set
  • Fixed bug when importing a training set if there was an empty line in the file (an exception was thrown)
  • Stock Market Prediction samples
  • OCR tools and API (handwriting and text recognition)
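
Several of the entries above are code-level API changes. The sketch below shows how they fit together in a typical training setup. It is a minimal sketch assuming the 2.4-era API: the SupervisedTrainingElement constructor, the learnInSameThread method, and the setMinErrorChange/setMinErrorChangeIterationsLimit setter names are inferred from the entries above and from common Neuroph usage, not verified signatures.

    import org.neuroph.core.NeuralNetwork;
    import org.neuroph.core.learning.SupervisedTrainingElement;
    import org.neuroph.core.learning.TrainingSet;
    import org.neuroph.nnet.MultiLayerPerceptron;
    import org.neuroph.nnet.learning.BackPropagation;

    public class TrainingSketch {

        public static void main(String[] args) {
            // TrainingSet constructor with declared input/output sizes;
            // each element added below is validated against these sizes.
            TrainingSet trainingSet = new TrainingSet(2, 1);
            trainingSet.addElement(new SupervisedTrainingElement(
                    new double[] {0, 0}, new double[] {0}));
            trainingSet.addElement(new SupervisedTrainingElement(
                    new double[] {0, 1}, new double[] {1}));
            trainingSet.addElement(new SupervisedTrainingElement(
                    new double[] {1, 0}, new double[] {1}));
            trainingSet.addElement(new SupervisedTrainingElement(
                    new double[] {1, 1}, new double[] {0}));

            MultiLayerPerceptron network = new MultiLayerPerceptron(2, 3, 1);

            BackPropagation learningRule = new BackPropagation();
            learningRule.setMaxError(0.01); // now the default; set explicitly here for clarity

            // New 'min error change' stopping condition: stop when the error
            // changes by less than 0.0001 for 20 consecutive iterations.
            learningRule.setMinErrorChange(0.0001);
            learningRule.setMinErrorChangeIterationsLimit(20);

            network.setLearningRule(learningRule);
            network.learnInSameThread(trainingSet);
            network.save("trained.nnet");
        }

        // Loading a saved network bundled inside the jar, using the new
        // load(InputStream) overload together with getResourceAsStream.
        static NeuralNetwork loadFromJar() {
            return NeuralNetwork.load(
                    TrainingSketch.class.getResourceAsStream("/trained.nnet"));
        }
    }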

New in Neuroph 2.3.1 (May 15, 2010)

  • Changed the image recognition API so the color mode is automatically detected from the settings used for network training; removed unnecessary methods (see the sketch after this list)
  • Graph view: migrated to JUNG 2.0, created network-specific layouts, and removed unnecessary options
  • An ANT build file that can build the jars for the library and the GUI is now included in the release.
  • Several bugfixes for version 2.3:
  • fixed issue with editing the GUI in NetBeans (fixed the NetBeans project file)
  • fixed the LMS formula
  • fixed testing in black-and-white mode for image recognition
  • fixed GUI bug: exceptions when creating large networks
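
With the color mode now detected from the training settings, recognizing an image from code needs no mode parameter. A minimal sketch, assuming the contrib imgrec API of this era; the ImageRecognitionPlugin lookup and the recognizeImage return type follow common Neuroph examples, but treat the exact package path and getPlugin signature as assumptions.

    import java.io.File;
    import java.util.HashMap;

    import org.neuroph.contrib.imgrec.ImageRecognitionPlugin;
    import org.neuroph.core.NeuralNetwork;

    public class RecognitionSketch {

        public static void main(String[] args) throws Exception {
            // Load a network previously trained on images; the color mode used
            // during training is now applied automatically when recognizing.
            NeuralNetwork nnet = NeuralNetwork.load("imgrec.nnet");
            ImageRecognitionPlugin imageRecognition =
                    (ImageRecognitionPlugin) nnet.getPlugin(ImageRecognitionPlugin.class);

            // Returns a map from image label to recognition score.
            HashMap<String, Double> output =
                    imageRecognition.recognizeImage(new File("someImage.jpg"));
            System.out.println(output);
        }
    }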

New in Neuroph 2.4 RC1 (Mar 15, 2010)

  • Perceptron and MultiLayerPerceptron now use a linear transfer function in the input layer (this appears to improve learning, and is the correct approach)
  • Changed the way ThresholdNeuron calculates output: it used to compare the total input with the threshold, but now it computes the difference totalInput - thresh. Since the output uses a Step transfer function this makes no difference to the final result, but it is a better model for visualization.
  • The training monitor is now displayed as an internal frame so it does not hide behind the main frame.
  • New icons for toolbar buttons
  • Created start.bat for easyNeurons
  • Default max error setting of 0.01 for all supervised learning rules (many users forget to set it when training from code)
  • Added load(InputStream inputStream) method to the NeuralNetwork class to enable the use of getResourceAsStream to load a neural network from a jar.
  • Added BiasNeuron class, which provides the bias feature for MLPs and other networks. A bias neuron always outputs a high activation level and has no inputs.
  • Added bias neuron in MultiLayerPerceptrons
  • Option to create direct connections from the input to the output layer.
  • Users can choose which learning rule to use for an MLP from the GUI: basic backpropagation, backpropagation with momentum, or the new dynamic backpropagation, which provides new learning features.
  • Total network error formula fixed (again): total_error = (1/(2n)) * sum(e^2). We now multiply by 1/(2n), whereas before it was just 1/n; the original formula uses 1/(2n)
  • Pause learning feature: learning threads can be paused from the GUI and from code.
  • Created PerceptronLearning rule, an LMS-based learning rule for perceptrons (but not the same as BinaryDeltaRule)
  • Added hasReachedStopCondition() to the SupervisedLearning class so derived classes can override it to create custom stopping conditions if needed.
  • Added new stopping condition 'min error change' to SupervisedLearning, to stop learning if the error change stays too small for some number of iterations (when learning gets stuck in a local minimum)
  • Added doOneLearningIteration method to IterativeLearning, which allows step-by-step learning
  • Added DynamicBackpropagation, which can use a dynamic learning rate and momentum. It is possible to specify min and max values and a change rate. If the total network error is decreasing, both parameters are increased to speed up error lowering; when the error is increasing, both values are decreased to limit the error growth.
  • Improved thread synchronization for the error graph: training is faster and drawing is smoother.
  • Added initializeWeights() methods to the NeuralNetwork class to provide a way to initialize the network with the same weights every time.
  • Neurons and their components are now created using reflection in NeuronFactory, which provides a powerful mechanism for creating/adding custom neurons and transfer functions.
  • Added Properties and modified the NeuronProperties class (util package), so it now accepts a neuron specification in the form of a (key, value) collection where the values are Class instances (see the NeuronFactory sketch after this list).
  • Perceptron and Backpropagation samples, which provide learning visualization.
  • Neuron properties are now displayed in an internal frame so the dialog does not hide behind the main frame when the user clicks somewhere else. It also shows the neuron's class.
  • Added a mechanism for defining size constraints and validation for input and output vectors in a training set: TrainingSet constructors can accept sizes for the input and/or output vector, and each training element is checked before it is added to the training set.
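
The reflection-based factory means a neuron can be specified declaratively and assembled by NeuronFactory. A rough sketch, assuming the util-package API; the property keys used here ('neuronType', 'transferFunction') and the exact setProperty/createNeuron signatures are illustrative assumptions based on the entries above.

    import org.neuroph.core.Neuron;
    import org.neuroph.core.transfer.Sigmoid;
    import org.neuroph.util.NeuronFactory;
    import org.neuroph.util.NeuronProperties;

    public class CustomNeuronSketch {

        public static void main(String[] args) {
            // Neuron specification as a (key, value) collection where the
            // values are Class instances, resolved via reflection.
            NeuronProperties properties = new NeuronProperties();
            properties.setProperty("neuronType", Neuron.class);        // assumed key
            properties.setProperty("transferFunction", Sigmoid.class); // assumed key

            Neuron neuron = NeuronFactory.createNeuron(properties);
            System.out.println(neuron.getClass().getSimpleName());
        }
    }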

New in Neuroph 2.1.1 Beta (Apr 11, 2009)

  • Fixed a drawing bug in the Kohonen network sample
  • Fixed an exception thrown by backpropagation with the tanh transfer function

And some new features:
  • basic neuron sample
  • unsupervised Hebbian learning
  • Oja learning rule (see the sketch after this list)
  • new GUI components JNeuron, JLayer, and JNeuralNetwork, which provide a cleaner API and new features like neuron color depending on activation level
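
For the new unsupervised rules, usage mirrors the supervised networks. A tentative sketch: the UnsupervisedHebbianNetwork and OjaLearning class names and the TrainingElement constructor are assumed from Neuroph naming conventions, not verified against this release.

    import org.neuroph.core.learning.TrainingElement;
    import org.neuroph.core.learning.TrainingSet;
    import org.neuroph.nnet.UnsupervisedHebbianNetwork;
    import org.neuroph.nnet.learning.OjaLearning;

    public class HebbianSketch {

        public static void main(String[] args) {
            // Unsupervised training set: input vectors only, no target outputs.
            TrainingSet trainingSet = new TrainingSet();
            trainingSet.addElement(new TrainingElement(new double[] {1, 0, 1, 0}));
            trainingSet.addElement(new TrainingElement(new double[] {0, 1, 0, 1}));

            // Four inputs, one output neuron; the default rule is Hebbian learning.
            UnsupervisedHebbianNetwork network = new UnsupervisedHebbianNetwork(4, 1);

            // Swap in the Oja rule instead of the default, if desired.
            network.setLearningRule(new OjaLearning());
            network.learnInSameThread(trainingSet);
        }
    }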