Abstract: Learning the slope (gradient) of each neuron's activation
function, in addition to the link weights, gives the network a new
property: flexibility. In flexible neural networks, because the
operation of the neurons is supervised and controlled, the burden of
learning does not fall on the link weights alone; in each learning
epoch every neuron, through the slope of its activation function,
cooperates toward the learning goal, so the number of learning epochs
decreases considerably. Furthermore, learning the neurons' parameters
makes them robust against changes in their inputs and against the
factors that cause such changes. As with the initial choice of
weights, the type of activation function, the initial slope of the
activation function, and the fixed coefficient multiplied by the error
gradient to compute the changes in the weights and activation-function
slopes all have a direct effect on the convergence of the network
during learning.
Abstract: This paper presents a method for hiding a text message (steganography) in a grayscale image. The method first obtains the binary value of each character of the text message. In the next stage, it locates the dark (black) regions of the grayscale image by converting the original image to a binary image and labeling each object using 8-connectivity. These images are then converted to RGB in order to find the dark regions, because in this way each gray level maps to an RGB color and the dark level of the grayscale image can be identified; if the grayscale image is very light, its histogram must be adjusted manually so that only dark regions are selected. In the final stage, each group of 8 pixels in the dark regions is treated as a byte, and the binary value of each character is placed in the low bit of each such byte formed from the dark-region pixels, increasing the security of the basic LSB steganography method.
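The final embedding stage can be sketched as follows. This is a simplified illustration, not the paper's full pipeline: instead of binarization and 8-connectivity labeling, it assumes dark pixels are simply those below a hypothetical intensity `threshold`, and it works on a flat list of gray values. An LSB change moves a pixel by at most 1, so a pixel below the threshold stays below it and the extractor recovers the same dark set.

```python
def embed(pixels, message, threshold=30):
    # Hide the message in the LSBs of dark pixels only (value < threshold).
    # `threshold` is an assumed cutoff standing in for the paper's
    # binarization + 8-connectivity labeling step.
    bits = [(ord(c) >> i) & 1 for c in message for i in range(7, -1, -1)]
    out = list(pixels)
    dark = [i for i, p in enumerate(out) if p < threshold]
    if len(dark) < len(bits):
        raise ValueError("not enough dark pixels to hold the message")
    for bit, i in zip(bits, dark):
        out[i] = (out[i] & ~1) | bit   # overwrite the least significant bit
    return out

def extract(pixels, length, threshold=30):
    # Rebuild each character from 8 consecutive dark-pixel LSBs, MSB first.
    dark = [p for p in pixels if p < threshold]
    chars = []
    for k in range(length):
        byte = 0
        for p in dark[8 * k:8 * k + 8]:
            byte = (byte << 1) | (p & 1)
        chars.append(chr(byte))
    return "".join(chars)

# Usage on a toy "image": 64 dark pixels followed by 10 bright ones.
stego = embed([10] * 64 + [200] * 10, "Hi")
recovered = extract(stego, 2)
```

Restricting the carrier bytes to dark regions is what distinguishes this scheme from plain LSB embedding: an attacker scanning the LSBs of the whole image in order does not see the message bits in sequence.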