
Rumelhart1985

In 1985, Rumelhart, Hinton, and Williams introduced backpropagation with gradient descent as a method of training neural networks that have more than just an input layer and an output layer.

This Java project implements the algorithm described in Rumelhart's paper.
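
For orientation, the heart of the algorithm is the generalized delta rule: each weight is adjusted downhill on the error surface, in proportion to the error signal of the unit it feeds. Stated informally (notation loosely following the paper; η is the learning rate, f the unit's activation function, net_j the unit's summed input):

$$\Delta w_{ji} = \eta\, \delta_j\, o_i$$

$$\delta_j = (t_j - o_j)\, f'(\mathrm{net}_j) \quad \text{for output units}, \qquad \delta_j = f'(\mathrm{net}_j) \sum_k \delta_k w_{kj} \quad \text{for hidden units}$$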

Installation and Use

Be sure Java 11 and Maven are installed.

Clone or copy the repository (the root directory and everything below it) onto your machine.

Run "mvn clean test", and the library will be compiled and tests run.

The main class of the library is Network. A network is given its structure with "withStructure" and is then fed training patterns (which must match the network's input and output structure) via "Network.learn()".

The NetworkTest.java file demonstrates proper (and improper) use.
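
As a rough sketch of what a training loop might look like (the argument lists of withStructure and learn below are assumptions, not the library's documented API; NetworkTest.java shows the real calls):

```java
// Hypothetical usage sketch -- method signatures are assumed, not confirmed.
public class XorExample {
    public static void main(String[] args) {
        // Assumed: withStructure takes layer sizes (inputs, hidden units, outputs).
        Network network = new Network().withStructure(2, 2, 1);

        // Training patterns must match the network's input and output structure:
        // two input values and one target value per pattern.
        double[][] inputs  = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
        double[][] targets = { {0},    {1},    {1},    {0}    };

        // Assumed: learn takes one input pattern and its target pattern.
        for (int epoch = 0; epoch < 10_000; epoch++) {
            for (int p = 0; p < inputs.length; p++) {
                network.learn(inputs[p], targets[p]);
            }
        }
    }
}
```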

Technologies Used

Core Java.

Acknowledgements

"Learning Internal Representations by Error Propagation"

  • DE Rumelhart, GE Hinton, RJ Williams - 1985 - apps.dtic.mil
