back to backpropagation (Thu, 23 Sep 2004 21:23:25 GMT)

[propagate is a funny word] today was spent hacking another neural network. this time i tried the one from pseudonym67: NN.NET The BackPropagation Network. 1st off, pseudonym67 has a number of great AI articles on codeproject ... recommended. on the off chance he/she reads this, i want their email addy. the NN code provided has 3 different types of NNs: Adaline, which is too simple, so ignore it; BackPropagation, which is the one just about everybody uses and the one i used today; and SOM (or Kohonen), which is just plain weird because it uses unsupervised training, so look at it for the cool factor. the thing that sucks about the BackProp NN articles is that they use lame examples with only 1 output and a max of 20 inputs. basically all those examples can tell you is a boolean yes/no. but the code provided is better than that and can handle more complex scenarios. had to muck with the network code a little, but was able to get it to do the same OCR task as the network i was using yesterday.
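to show what "more than a boolean" means in code, here's a quick sketch (my own toy code, not pseudonym67's actual API) of the usual OCR trick: one output neuron per character, train against a one-hot target vector, and read the answer back as whichever output fires hardest:

using System;

class OcrEncoding
{
    // hypothetical character set; a real OCR net would cover whatever
    // glyphs you trained it on
    const string Chars = "0123456789";

    // build the target vector for a training sample: all 0s except
    // a 1 at the index of the expected character
    static double[] Target(char c)
    {
        double[] t = new double[Chars.Length];
        t[Chars.IndexOf(c)] = 1.0;
        return t;
    }

    // decode the network's output layer: the neuron that fired
    // hardest (argmax) is the recognized character
    static char Decode(double[] outputs)
    {
        int best = 0;
        for (int i = 1; i < outputs.Length; i++)
            if (outputs[i] > outputs[best]) best = i;
        return Chars[best];
    }

    static void Main()
    {
        double[] t = Target('7');
        Console.WriteLine(Decode(t)); // prints 7
    }
}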

the code was different, but the 2 BPNs were inherently the same. when running, the network from yesterday had the additional usage of Threshold; this one did not have that, but it could optionally use Bias. when learning, yesterday's used 2 different learning rates for the output and hidden layers out of the box; this one can support that as well, it just takes a minuscule effort. this one also adds the concept of Momentum. yesterday's had some error code which was not entirely hooked up. today's has more error code ... but i don't entirely understand what it is supposed to be telling me yet.
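for reference, here's a rough sketch of the weight update with momentum (again my own toy code, not from either article; eta and alpha are names i made up). bias is just treated as one more weight whose input is always 1.0, and separate learning rates for the hidden and output layers fall out of passing a different eta per layer:

using System;

class BackPropUpdate
{
    double[] weights;
    double[] lastDelta; // previous step per weight, what momentum looks back at

    public BackPropUpdate(int n)
    {
        weights = new double[n];
        lastDelta = new double[n];
    }

    // gradient[i] is delta * input for weight i, already computed by
    // backprop; eta is the learning rate for this layer (hidden and
    // output layers can each get their own), alpha is the momentum term.
    // a bias weight works the same way, its "input" just being 1.0.
    public void Update(double[] gradient, double eta, double alpha)
    {
        for (int i = 0; i < weights.Length; i++)
        {
            double step = eta * gradient[i] + alpha * lastDelta[i];
            weights[i] += step;
            lastDelta[i] = step; // remembered for the next pass
        }
    }

    static void Main()
    {
        BackPropUpdate layer = new BackPropUpdate(2);
        double[] grad = new double[] { 0.5, -0.25 };
        layer.Update(grad, 0.3, 0.0); // first pass, no momentum yet
        layer.Update(grad, 0.3, 0.9); // second pass picks up 90% of the last step
        Console.WriteLine(layer.weights[0]); // 0.15 + (0.15 + 0.9 * 0.15) = 0.435
    }
}

the momentum term is what keeps the weights moving in a consistent direction across passes instead of jittering, which is why it tends to speed up convergence on nets like this.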