

ABSTRACT

This paper studies how the choice of activation function affects the performance of multi-layer neural networks (MLNNs) trained on multi-logic patterns. Our network has L hidden layers, two inputs, and three to six outputs, and is trained with the backpropagation (BP) algorithm. We used the logic functions XOR (exclusive OR), OR, AND, NAND (not AND), NXOR (not exclusive OR), and NOR (not OR) as multi-logic teacher signals to evaluate the training performance of MLNNs under different activation functions, with a view to information and data enlargement in signal processing (the synaptic divergence state). We compared four activation functions: Sigmoid, ReLU, Step, and a modified function we call the L & exp. function, which gave the highest training ability of the four in our simulations. Finally, we propose the L & exp. function as well suited to MLNNs; because of its training performance across multiple logic patterns, it may also be applicable to signal processing for data and information enlargement and can be adopted in deep learning.
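The training setup described above can be sketched as follows. This is a minimal illustration, not the authors' code: a two-input network with one hidden layer and six outputs (one per logic teacher signal: XOR, OR, AND, NAND, NXOR, NOR), trained by backpropagation on squared error. The paper's L & exp. activation is not defined in the abstract, so a sigmoid is used as a stand-in; the hidden-layer width, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# The four binary input patterns.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
# Teacher signals per input pattern: XOR, OR, AND, NAND, NXOR, NOR.
T = np.array([[0, 0, 0, 1, 1, 1],
              [1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0]], dtype=float)

def sigmoid(z):
    # Stand-in activation; the paper's L & exp. function is not
    # specified in the abstract.
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units (an assumed width), small random weights.
W1 = rng.normal(scale=1.0, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 6))
b2 = np.zeros(6)

lr = 0.5  # assumed learning rate
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backpropagation of the squared-error gradient through the
    # sigmoid derivative s(1 - s).
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ dy
    b2 -= lr * dy.sum(axis=0)
    W1 -= lr * X.T @ dh
    b1 -= lr * dh.sum(axis=0)

# Final mean squared error over all six logic patterns.
mse = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - T) ** 2)
print(mse)
```

Comparing activation functions in this framework amounts to swapping the `sigmoid` (and its derivative in the backward pass) for ReLU, Step, or the paper's L & exp. function and tracking how quickly the error falls.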

KEYWORDS

Multi-layer neural networks, learning performance, multi-logic training patterns, activation function, BP neural network, deep learning

Copyright © 2001 - David Publishing Company All rights reserved, www.davidpublisher.com