Affiliation(s)
1. Mechanical Systems Engineering Course, Graduate School of Engineering and Science, University of the Ryukyus, Senbaru 1, Nishihara, Okinawa 903-0213, Japan
2. Faculty of Engineering, University of the Ryukyus, Senbaru 1, Nishihara, Okinawa 903-0213, Japan
ABSTRACT
In this paper, we present a study of activity functions for an MLNN (multi-layered neural network) and propose a suitable activity function for data enlargement processing. We carefully examined the training performance of the Sigmoid, ReLU, Leaky ReLU and L & exp. activity functions on training patterns that map a few inputs to multiple outputs. Our MLNN model has L hidden layers and maps two or three inputs to four or six output variations, trained by BP (backpropagation) NN (neural network) training. We focused on multi-teacher training signals to investigate and evaluate the training performance of MLNNs and to select the activity function best suited to data enlargement, which could then be applied to image and signal processing (synaptic divergence) together with the proposed methods in convolutional networks. Among the four activity functions examined, we found that the L & exp. activity function suits DENN (data enlargement neural network) training, since it achieved the highest training-success percentage compared with the Sigmoid, ReLU and Leaky ReLU activity functions during simulation and training of data in the network. Finally, we recommend the L & exp. function as a good choice for MLNNs; because of its training performance characteristics with multiple teacher training patterns on originally generated data, it may also be applicable to signal processing for data and information enlargement and could be tried with CNNs (convolutional neural networks) for image processing.
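The following is a minimal sketch, not taken from the paper, of the setting the abstract describes: a small fully connected network trained by backpropagation to map a few inputs onto a larger number of outputs (data enlargement). The Sigmoid, ReLU and Leaky ReLU definitions are the standard ones; the layer sizes, data and learning rate are illustrative assumptions, and the paper's L & exp. activity function is not reproduced because its exact form is given only in the body of the paper.

    import numpy as np

    # Standard activity functions compared in the paper (L & exp. omitted, see above).
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):
        return np.maximum(0.0, x)

    def leaky_relu(x, slope=0.01):
        return np.where(x > 0.0, x, slope * x)

    def d_sigmoid(y):
        # Derivative of the sigmoid expressed via its output y = sigmoid(x).
        return y * (1.0 - y)

    rng = np.random.default_rng(0)

    # Data enlargement example: 2 inputs mapped to 4 outputs through one hidden layer.
    W1 = rng.normal(scale=0.5, size=(2, 8))
    W2 = rng.normal(scale=0.5, size=(8, 4))

    X = rng.uniform(size=(16, 2))   # hypothetical input patterns
    T = rng.uniform(size=(16, 4))   # hypothetical multi-teacher signals

    lr = 0.1
    for epoch in range(1000):
        # Forward pass using the sigmoid activity function as the example.
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)

        # Backpropagation of the squared-error gradient.
        dY = (Y - T) * d_sigmoid(Y)
        dH = (dY @ W2.T) * d_sigmoid(H)
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH

Swapping relu or leaky_relu into the hidden layer would require the corresponding derivatives in the backward pass; the loop above uses sigmoid throughout only to keep the sketch short.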
KEYWORDS
Data enlargement processing, MLNN, activity function, multi-teacher training signals, BP NN, CNN.